Sample records for construction method selection

  1. Selection of Construction Methods: A Knowledge-Based Approach

    PubMed Central

    Skibniewski, Miroslaw

    2013-01-01

The appropriate selection of construction methods to be used during the execution of a construction project is a major determinant of high productivity, but sometimes this selection process is performed without the care and the systematic approach that it deserves, with negative consequences. This paper proposes a knowledge management approach that enables the intelligent use of corporate experience and information and helps to improve the selection of construction methods for a project. A knowledge-based system to support this decision-making process is then proposed and described. To define and design the system, semistructured interviews were conducted within three construction companies with the purpose of studying the way the method selection process is carried out in practice and the knowledge associated with it. A prototype of a Construction Methods Knowledge System (CMKS) was developed and then validated with construction industry professionals. As a conclusion, the CMKS was perceived as a valuable tool for construction method selection, helping companies to generate a corporate memory on this issue and reducing both the reliance on individual knowledge and the subjectivity of the decision-making process. The described benefits provided by the system favor a better performance of construction projects. PMID:24453925

  2. Constructs and methods for genome editing and genetic engineering of fungi and protists

    DOEpatents

    Hittinger, Christopher Todd; Alexander, William Gerald

    2018-01-30

Provided herein are constructs for genome editing or genetic engineering in fungi or protists, methods of using the constructs, and media for use in selecting cells. The constructs include a polynucleotide encoding a thymidine kinase operably connected to a promoter, suitably a constitutive promoter; a polynucleotide encoding an endonuclease operably connected to an inducible promoter; and a recognition site for the endonuclease. The constructs may also include selectable markers for use in selecting recombinants.

  3. Pharmacy Student Performance on Constructed-Response Versus Selected-Response Calculations Questions

    PubMed Central

    Addo, Richard T.

    2013-01-01

Objective. To introduce PharmD students to changes in calculations question types (constructed-response versus selected-response questions); to measure and compare student performance on constructed-response and selected-response questions in a pharmaceutics course; and to collect student feedback on the use of differing question types. Methods. A pharmaceutics/pharmaceutical calculations examination was administered that included 15 pairs of questions; each pair consisted of a constructed-response question and a similar selected-response question. An online questionnaire was conducted to collect student feedback. Results. Of the 15 topics, the class scored higher on the constructed-response question for 4 topics and higher on the selected-response question for 10 topics. Eighty percent of the class preferred selected-response questions, although 47.8% felt constructed-response questions better prepared them for a career in healthcare. Conclusions. Students correctly answered more selected-response questions than constructed-response questions and felt more confident in doing so. Additional constructed-response teaching and testing methods should be incorporated into pharmacy education. PMID:23459503

  4. A novel evaluation method for building construction project based on integrated information entropy with reliability theory.

    PubMed

    Bai, Xiao-ping; Zhang, Xi-wei

    2013-01-01

Selecting construction schemes for a building engineering project is a complex multiobjective optimization decision process in which many indexes must be considered to find the optimum scheme. Addressing this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes, uses a quantitative method for the cost index and integrated qualitative and quantitative methodologies for the progress, quality, and safety indexes, and integrates engineering economics, reliability theory, and information entropy theory to present a new evaluation method for building construction projects. Combined with a practical case, the paper also presents the detailed computing processes and steps: selecting all order indexes, establishing the index matrix, computing the score values of all order indexes, computing the synthesis score, sorting all selected schemes, and making the analysis and decision. The presented method can offer a valuable reference for risk computation in building construction projects.
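The information-entropy weighting step described in this abstract follows a standard recipe (normalise the index matrix, compute per-index entropy, derive weights from divergence, form a weighted synthesis score). The sketch below is a generic illustration, not the authors' implementation; the four-scheme rating matrix is invented, and all index values are assumed to have already been converted to a positive, benefit-oriented scale:

```python
import numpy as np

def entropy_weights(X):
    """Information-entropy weights for a decision matrix X
    (rows = candidate schemes, columns = evaluation indexes).
    All values are assumed positive and benefit-oriented."""
    P = X / X.sum(axis=0)                          # normalise each column
    m = X.shape[0]
    E = -(P * np.log(P)).sum(axis=0) / np.log(m)   # entropy per index
    d = 1.0 - E                                    # degree of divergence
    return d / d.sum()                             # normalised weights

def synthesis_scores(X, w):
    """Weighted sum giving one synthesis score per scheme."""
    return X @ w

# Four hypothetical schemes rated on cost, progress, quality, safety.
X = np.array([[0.7, 0.8, 0.9, 0.6],
              [0.9, 0.6, 0.7, 0.8],
              [0.6, 0.9, 0.8, 0.7],
              [0.8, 0.7, 0.6, 0.9]])
w = entropy_weights(X)
scores = synthesis_scores(X, w)
best = int(np.argmax(scores))   # index of the top-ranked scheme
```

Sorting `scores` in descending order reproduces the "sorting all selected schemes" step.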

  5. 23 CFR 635.104 - Method of construction.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... effectiveness is necessary in selecting projects for the design-build delivery method. [56 FR 37004, Aug. 2... 23 Highways 1 2010-04-01 2010-04-01 false Method of construction. 635.104 Section 635.104 Highways... CONSTRUCTION AND MAINTENANCE Contract Procedures § 635.104 Method of construction. (a) Actual construction work...

  6. Measurement versus prediction in the construction of patient-reported outcome questionnaires: can we have our cake and eat it?

    PubMed

    Smits, Niels; van der Ark, L Andries; Conijn, Judith M

    2017-11-02

Two important goals when using questionnaires are (a) measurement: the questionnaire is constructed to assign numerical values that accurately represent the test taker's attribute, and (b) prediction: the questionnaire is constructed to give an accurate forecast of an external criterion. Construction methods aimed at measurement prescribe that items should be reliable. In practice, this leads to questionnaires with high inter-item correlations. By contrast, construction methods aimed at prediction typically prescribe that items have a high correlation with the criterion and low inter-item correlations. The latter approach has often been said to produce a paradox concerning the relation between reliability and validity [1-3], because it is often assumed that good measurement is a prerequisite of good prediction. We address four questions: (1) Why are measurement-based methods suboptimal for questionnaires that are used for prediction? (2) How should one construct a questionnaire that is used for prediction? (3) Do questionnaire-construction methods that optimize measurement and prediction lead to the selection of different items in the questionnaire? (4) Is it possible to construct a questionnaire that can be used for both measurement and prediction? An empirical data set consisting of scores of 242 respondents on questionnaire items measuring mental health is used to select items by means of two methods: a method that optimizes the predictive value of the scale (i.e., forecasting a clinical diagnosis), and a method that optimizes the reliability of the scale. We show that the two methods select different sets of items and that a scale constructed to meet one goal does not show optimal performance with respect to the other goal. The answers are as follows: (1) Because measurement-based methods tend to maximize inter-item correlations, which reduces predictive validity. (2) By selecting items that correlate highly with the criterion and weakly with the remaining items. (3) Yes, these methods may lead to different item selections. (4) For a single questionnaire: yes, but it is problematic because reliability cannot be estimated accurately. For a test battery: yes, but it is very costly. Implications for the construction of patient-reported outcome questionnaires are discussed.
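The prediction-oriented construction rule in answer (2), selecting items that correlate highly with the criterion and weakly with each other, can be sketched as a greedy procedure. This is an illustrative approximation only, not the authors' method; the simulated data and the `penalty` trade-off parameter are invented:

```python
import numpy as np

def select_predictive_items(items, criterion, k=3, penalty=1.0):
    """Greedy sketch of prediction-oriented scale construction:
    repeatedly add the item with the highest criterion correlation,
    penalised by its mean absolute correlation with items already chosen.
    `items` is an (n_respondents, n_items) score matrix."""
    n_items = items.shape[1]
    r_crit = np.array([abs(np.corrcoef(items[:, j], criterion)[0, 1])
                       for j in range(n_items)])
    chosen = []
    while len(chosen) < k:
        best_j, best_score = None, -np.inf
        for j in range(n_items):
            if j in chosen:
                continue
            if chosen:
                r_inter = np.mean([abs(np.corrcoef(items[:, j],
                                                   items[:, c])[0, 1])
                                   for c in chosen])
            else:
                r_inter = 0.0
            score = r_crit[j] - penalty * r_inter
            if score > best_score:
                best_j, best_score = j, score
        chosen.append(best_j)
    return chosen

# Simulated data: 242 respondents (as in the study), 10 candidate items;
# the criterion depends mainly on items 0 and 3.
rng = np.random.default_rng(0)
items = rng.normal(size=(242, 10))
criterion = items[:, 0] + 0.5 * items[:, 3] + rng.normal(size=242)
picked = select_predictive_items(items, criterion, k=3)
```

A reliability-oriented method would instead favour items with high inter-item correlations, which is exactly why the two goals select different items.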

  7. Transient dominant host-range selection using Chinese hamster ovary cells to generate marker-free recombinant viral vectors from vaccinia virus.

    PubMed

    Liu, Liang; Cooper, Tamara; Eldi, Preethi; Garcia-Valtanen, Pablo; Diener, Kerrilyn R; Howley, Paul M; Hayball, John D

    2017-04-01

Recombinant vaccinia viruses (rVACVs) are promising antigen-delivery systems for vaccine development that are also useful as research tools. Two common methods for selection during construction of rVACV clones are (i) co-insertion of drug resistance or reporter protein genes, which requires the use of additional selection drugs or detection methods, and (ii) dominant host-range selection. The latter uses VACV variants rendered replication-incompetent in host cell lines by the deletion of host-range genes. Replicative ability is restored by co-insertion of the host-range genes, providing for dominant selection of the recombinant viruses. Here, we describe a new method for the construction of rVACVs using the cowpox CP77 protein and unmodified VACV as the starting material. Our selection system will expand the range of tools available for positive selection of rVACV during vector construction, and it offers substantially higher fidelity than approaches based on selection for drug resistance.

  8. A Method for Search Engine Selection using Thesaurus for Selective Meta-Search Engine

    NASA Astrophysics Data System (ADS)

    Goto, Shoji; Ozono, Tadachika; Shintani, Toramatsu

In this paper, we propose a new method for selecting search engines on the WWW for a selective meta-search engine. A selective meta-search engine needs a method for selecting the search engines appropriate for a user's query. Most existing methods use statistical data such as document frequency, and may select inappropriate search engines if a query contains polysemous words. In this paper, we describe a search engine selection method based on a thesaurus. In our method, a thesaurus is constructed from the documents in a search engine and is used as a source description of that search engine. The form of a particular thesaurus depends on the documents used for its construction. Our method enables search engine selection that considers the relationships between terms, and thereby overcomes the problems caused by polysemous words. Further, our method does not require a centralized broker maintaining data, such as document frequency, for all search engines. As a result, it is easy to add a new search engine, and meta-search engines become more scalable with our method than with other existing methods.

  9. Fundamental Vocabulary Selection Based on Word Familiarity

    NASA Astrophysics Data System (ADS)

    Sato, Hiroshi; Kasahara, Kaname; Kanasugi, Tomoko; Amano, Shigeaki

This paper proposes a new method for selecting fundamental vocabulary. We are presently constructing the Fundamental Vocabulary Knowledge-base of Japanese, which contains integrated information on syntax, semantics and pragmatics for the purposes of advanced natural language processing. This database mainly consists of a lexicon and a treebank: Lexeed (a Japanese Semantic Lexicon) and the Hinoki Treebank. Fundamental vocabulary selection is the first step in the construction of Lexeed. The vocabulary should include sufficient words to describe general concepts for self-expandability, yet should not be prohibitively large to construct and maintain. There are two conventional methods for selecting fundamental vocabulary. The first is intuition-based selection by experts, the traditional method for making dictionaries; its weak point is that the selection depends strongly on personal intuition. The second is corpus-based selection, which is more objective than intuition-based selection but requires a sufficiently balanced corpus, which is difficult to compile. We propose a psychologically motivated selection method that adopts word familiarity as the selection criterion. Word familiarity is a rating that represents how familiar a word is, expressed as a real number ranging from 1 (least familiar) to 7 (most familiar). We determined the word familiarity ratings statistically, based on psychological experiments with 32 subjects. We selected about 30,000 words as the fundamental vocabulary, using a minimum word familiarity threshold of 5. We also evaluated the vocabulary by comparing its word coverage with that of conventional intuition-based and corpus-based selection over dictionary definition sentences and novels, and demonstrated the superior coverage of our lexicon. Based on this, we conclude that the proposed method is superior to conventional methods for fundamental vocabulary selection.
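The familiarity-threshold step can be illustrated with a toy filter. The mini-lexicon and its ratings below are hypothetical; only the 1-7 rating scale and the threshold of 5 come from the abstract:

```python
# Hypothetical mini-lexicon: (word, familiarity rating on the 1-7 scale).
lexicon = [
    ("taberu", 6.8),     # "to eat" - highly familiar
    ("hon", 6.5),        # "book"
    ("shinryaku", 4.2),  # "invasion" - below threshold
    ("kashi", 3.9),      # rare word
]

FAMILIARITY_THRESHOLD = 5.0  # minimum familiarity used in the paper

# Keep only words rated at or above the threshold.
fundamental = [word for word, fam in lexicon
               if fam >= FAMILIARITY_THRESHOLD]
```

Applied to the full rating data, this kind of cut yields the roughly 30,000-word fundamental vocabulary reported.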

  10. Creating IRT-Based Parallel Test Forms Using the Genetic Algorithm Method

    ERIC Educational Resources Information Center

    Sun, Koun-Tem; Chen, Yu-Jen; Tsai, Shu-Yen; Cheng, Chien-Fen

    2008-01-01

    In educational measurement, the construction of parallel test forms is often a combinatorial optimization problem that involves the time-consuming selection of items to construct tests having approximately the same test information functions (TIFs) and constraints. This article proposes a novel method, genetic algorithm (GA), to construct parallel…

  11. Optimal Item Selection with Credentialing Examinations.

    ERIC Educational Resources Information Center

    Hambleton, Ronald K.; And Others

    The study compared two promising item response theory (IRT) item-selection methods, optimal and content-optimal, with two non-IRT item selection methods, random and classical, for use in fixed-length certification exams. The four methods were used to construct 20-item exams from a pool of approximately 250 items taken from a 1985 certification…

  12. Lunar-base construction equipment and methods evaluation

    NASA Technical Reports Server (NTRS)

    Boles, Walter W.; Ashley, David B.; Tucker, Richard L.

    1993-01-01

    A process for evaluating lunar-base construction equipment and methods concepts is presented. The process is driven by the need for more quantitative, systematic, and logical methods for assessing further research and development requirements in an area where uncertainties are high, dependence upon terrestrial heuristics is questionable, and quantitative methods are seldom applied. Decision theory concepts are used in determining the value of accurate information and the process is structured as a construction-equipment-and-methods selection methodology. Total construction-related, earth-launch mass is the measure of merit chosen for mathematical modeling purposes. The work is based upon the scope of the lunar base as described in the National Aeronautics and Space Administration's Office of Exploration's 'Exploration Studies Technical Report, FY 1989 Status'. Nine sets of conceptually designed construction equipment are selected as alternative concepts. It is concluded that the evaluation process is well suited for assisting in the establishment of research agendas in an approach that is first broad, with a low level of detail, followed by more-detailed investigations into areas that are identified as critical due to high degrees of uncertainty and sensitivity.

  13. L-O-S-T: Logging Optimization Selection Technique

    Treesearch

    Jerry L. Koger; Dennis B. Webster

    1984-01-01

    L-O-S-T is a FORTRAN computer program developed to systematically quantify, analyze, and improve user selected harvesting methods. Harvesting times and costs are computed for road construction, landing construction, system move between landings, skidding, and trucking. A linear programming formulation utilizing the relationships among marginal analysis, isoquants, and...

  14. Construction of human antibody gene libraries and selection of antibodies by phage display.

    PubMed

    Frenzel, André; Kügler, Jonas; Wilke, Sonja; Schirrmann, Thomas; Hust, Michael

    2014-01-01

Antibody phage display is the most commonly used in vitro selection technology and has yielded thousands of useful antibodies for research, diagnostics, and therapy. The prerequisite for successful generation and development of human recombinant antibodies using phage display is the construction of a high-quality antibody gene library. Here, we describe the methods for the construction of human immune and naive scFv gene libraries. The success also depends on the panning strategy for the selection of binders from these libraries. In this article, we describe a panning strategy that is high-throughput compatible and allows parallel selection in microtiter plates.

  15. Analysis of Criteria Influencing Contractor Selection Using TOPSIS Method

    NASA Astrophysics Data System (ADS)

    Alptekin, Orkun; Alptekin, Nesrin

    2017-10-01

Selection of the most suitable contractor is an important process in public construction projects. This process is a major decision which may influence the progress and success of a construction project. Improper selection of contractors may lead to problems such as poor quality of work and delays in project duration. Especially in construction projects for public buildings, the proper choice of contractor benefits the public institution. Public procurement processes have different characteristics reflecting dissimilarities in the political, social and economic features of each country. In Turkey, Turkish Public Procurement Law PPL 4734 is the main regulatory law for the procurement of public buildings. According to PPL 4734, public construction administrators must contract with the lowest bidder who meets the minimum requirements according to the criteria in the prequalification process. Because of the restrictive provisions of PPL 4734, public administrators cannot freely select the most suitable contractor. The lowest bid method does not enable public construction administrators to select the most qualified contractor, and they have realised that selecting a contractor based on the lowest bid alone is inadequate and may lead to the failure of the project in terms of time delay and poor quality standards. In order to evaluate the overall efficiency of a project, it is necessary to identify selection criteria. This study focuses on identifying the importance of criteria other than the lowest bid in the contractor selection process of PPL 4734. A survey was conducted among the staff of the Department of Construction Works of Eskisehir Osmangazi University. According to the TOPSIS (Technique for Order Preference by Similarity to the Ideal Solution) analysis results, termination of construction work in previous tenders is the most important of the 12 determined criteria; the lowest bid criterion ranks fifth.
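The TOPSIS technique named in this record follows a standard recipe: normalise the decision matrix, apply criterion weights, measure each alternative's distance to the ideal and anti-ideal solutions, and rank by relative closeness. The sketch below is a generic implementation of that recipe; the three contractors, their scores and the weights are invented, not the study's 12 criteria:

```python
import numpy as np

def topsis(X, weights, benefit):
    """Minimal TOPSIS sketch. X is (alternatives x criteria);
    benefit[j] is True for benefit criteria, False for cost criteria."""
    R = X / np.sqrt((X ** 2).sum(axis=0))      # vector-normalise columns
    V = R * weights                            # weighted normalised matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.sqrt(((V - ideal) ** 2).sum(axis=1))
    d_neg = np.sqrt(((V - anti) ** 2).sum(axis=1))
    return d_neg / (d_pos + d_neg)             # closeness: higher = better

# Three hypothetical contractors scored on bid price (cost criterion),
# past performance and work quality (both benefit criteria).
X = np.array([[100.0, 7.0, 8.0],
              [80.0, 9.0, 7.0],
              [90.0, 6.0, 9.0]])
weights = np.array([0.3, 0.4, 0.3])
closeness = topsis(X, weights, benefit=np.array([False, True, True]))
ranking = np.argsort(-closeness)               # best contractor first
```

Here the second contractor wins despite not having the best quality score, which is the point of weighing several criteria rather than the lowest bid alone.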

  16. A Feature and Algorithm Selection Method for Improving the Prediction of Protein Structural Class.

    PubMed

    Ni, Qianwu; Chen, Lei

    2017-01-01

Correct prediction of protein structural class is beneficial to the investigation of protein functions, regulations and interactions. In recent years, several computational methods have been proposed in this regard. However, it remains a great challenge to select a proper classification algorithm and to extract the essential features for classification. In this study, a feature and algorithm selection method is presented for improving the accuracy of protein structural class prediction. Amino acid compositions and physiochemical features were adopted to represent features, and thirty-eight machine learning algorithms collected in Weka were employed. All features were first analyzed by a feature selection method, minimum redundancy maximum relevance (mRMR), producing a feature list. Then, several feature sets were constructed by adding features from the list one by one. For each feature set, the thirty-eight algorithms were executed on a dataset in which proteins were represented by the features in the set. The classes predicted by these algorithms, together with the true class of each protein, were collected to construct a dataset, which was analyzed by the mRMR method to yield an algorithm list. Algorithms were then taken from this list one by one to build an ensemble prediction model, and the ensemble prediction model with the best performance was selected as the optimal one. Experimental results indicate that the constructed model is much superior to models using a single algorithm and to models that adopt only the feature selection procedure or only the algorithm selection procedure; both procedures are genuinely helpful for building an ensemble prediction model with better performance. Copyright © Bentham Science Publishers.
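Two of this abstract's building blocks, nested feature sets built from a ranked list and an ensemble vote over per-algorithm predictions, can be sketched briefly. The feature names and class labels are hypothetical, and this is an illustration rather than the authors' code:

```python
from collections import Counter

# Hypothetical ranked feature list, e.g. as produced by mRMR.
ranked = ["aa_composition", "hydrophobicity", "polarity", "charge"]

# Incremental (nested) feature sets: add one feature from the list at a time.
feature_sets = [ranked[:i + 1] for i in range(len(ranked))]

def ensemble_vote(predictions):
    """Majority vote over class labels predicted by the selected
    algorithms for one protein."""
    return Counter(predictions).most_common(1)[0][0]

# Three algorithms vote on a protein's structural class.
label = ensemble_vote(["alpha", "beta", "alpha"])
```

Each nested set would be handed to all thirty-eight algorithms in turn, and the vote aggregates the algorithms retained in the final ensemble.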

  17. The method of selecting an integrated development territory for the high-rise unique constructions

    NASA Astrophysics Data System (ADS)

    Sheina, Svetlana; Shevtsova, Elina; Sukhinin, Alexander; Priss, Elena

    2018-03-01

On the basis of data provided by the Department of Architecture and Urban Planning of the city of Rostov-on-Don, the problem of choosing a territory for complex development to be prioritised for the construction of high-rise and unique buildings is solved. The objective of the study was to develop a methodology for selecting such an area and to apply the proposed method to the evaluation of four territories for complex development. Along with standard indicators of complex evaluation, the developed method considers additional indicators that assess the territory from the perspective of high-rise unique construction. The final result of the study is a ranking of the functional priority areas that takes into account the construction of both residential and public and business objects of unique high-rise construction. The use of the developed methodology will allow investors and customers to assess the investment attractiveness of a future unique construction project on a proposed site.

  18. Construct Validity and Scoring Methods of the World Health Organization: Health and Work Performance Questionnaire Among Workers With Arthritis and Rheumatological Conditions.

    PubMed

    AlHeresh, Rawan; LaValley, Michael P; Coster, Wendy; Keysor, Julie J

    2017-06-01

To evaluate the construct validity and scoring methods of the World Health Organization Health and Work Performance Questionnaire (HPQ) for people with arthritis. Construct validity was examined through hypothesis testing using the recommended guidelines of the consensus-based standards for the selection of health measurement instruments (COSMIN). The HPQ using the absolute scoring method showed moderate construct validity, as four of the seven hypotheses were met. The HPQ using the relative scoring method had weak construct validity, as only one of the seven hypotheses was met. The absolute scoring method for the HPQ is superior in construct validity to the relative scoring method in assessing work performance among people with arthritis and related rheumatic conditions; however, more research is needed to further explore other psychometric properties of the HPQ.

  19. Analysis of Vibration and Noise of Construction Machinery Based on Ensemble Empirical Mode Decomposition and Spectral Correlation Analysis Method

    NASA Astrophysics Data System (ADS)

    Chen, Yuebiao; Zhou, Yiqi; Yu, Gang; Lu, Dan

In order to analyze the effect of engine vibration on the cab noise of construction machinery in multiple frequency bands, a new method based on ensemble empirical mode decomposition (EEMD) and spectral correlation analysis is proposed. Firstly, the intrinsic mode functions (IMFs) of the vibration and noise signals were obtained by the EEMD method, and the IMFs occupying the same frequency bands were selected. Secondly, the spectral correlation coefficients between the selected IMFs were calculated, identifying the main frequency bands in which engine vibration has a significant impact on cab noise. Thirdly, the dominant frequencies were picked out and analyzed by the spectral analysis method. The results show that the main frequency bands and dominant frequencies in which engine vibration has a serious impact on cab noise can be identified effectively by the proposed method, providing effective guidance for noise reduction in construction machinery.
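The spectral-correlation step can be illustrated on synthetic signals. This sketch assumes the IMFs have already been extracted by EEMD (the EEMD stage itself is omitted) and simply correlates the magnitude spectra of two matched components; the sampling rate and the shared 50 Hz component are invented for the example:

```python
import numpy as np

def spectral_correlation(imf_a, imf_b):
    """Correlation coefficient between the magnitude spectra of two
    signals (e.g. matched IMFs from vibration and noise recordings)."""
    A = np.abs(np.fft.rfft(imf_a))
    B = np.abs(np.fft.rfft(imf_b))
    return float(np.corrcoef(A, B)[0, 1])

# Synthetic example: two signals sharing a 50 Hz component plus noise.
fs = 1000.0                          # sampling rate in Hz (assumed)
t = np.arange(0, 1.0, 1.0 / fs)
rng = np.random.default_rng(1)
vibration = np.sin(2 * np.pi * 50 * t) + 0.1 * rng.normal(size=t.size)
cab_noise = 0.8 * np.sin(2 * np.pi * 50 * t) + 0.1 * rng.normal(size=t.size)

r = spectral_correlation(vibration, cab_noise)
```

A coefficient near 1 flags a band in which the two signals share spectral content, i.e. a band where engine vibration plausibly drives cab noise.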

  20. Discriminant content validity: a quantitative methodology for assessing content of theory-based measures, with illustrative applications.

    PubMed

    Johnston, Marie; Dixon, Diane; Hart, Jo; Glidewell, Liz; Schröder, Carin; Pollard, Beth

    2014-05-01

    In studies involving theoretical constructs, it is important that measures have good content validity and that there is not contamination of measures by content from other constructs. While reliability and construct validity are routinely reported, to date, there has not been a satisfactory, transparent, and systematic method of assessing and reporting content validity. In this paper, we describe a methodology of discriminant content validity (DCV) and illustrate its application in three studies. Discriminant content validity involves six steps: construct definition, item selection, judge identification, judgement format, single-sample test of content validity, and assessment of discriminant items. In three studies, these steps were applied to a measure of illness perceptions (IPQ-R) and control cognitions. The IPQ-R performed well with most items being purely related to their target construct, although timeline and consequences had small problems. By contrast, the study of control cognitions identified problems in measuring constructs independently. In the final study, direct estimation response formats for theory of planned behaviour constructs were found to have as good DCV as Likert format. The DCV method allowed quantitative assessment of each item and can therefore inform the content validity of the measures assessed. The methods can be applied to assess content validity before or after collecting data to select the appropriate items to measure theoretical constructs. Further, the data reported for each item in Appendix S1 can be used in item or measure selection. Statement of contribution What is already known on this subject? There are agreed methods of assessing and reporting construct validity of measures of theoretical constructs, but not their content validity. Content validity is rarely reported in a systematic and transparent manner. What does this study add? 
The paper proposes discriminant content validity (DCV), a systematic and transparent method of assessing and reporting whether items assess the intended theoretical construct and only that construct. In three studies, DCV was applied to measures of illness perceptions, control cognitions, and theory of planned behaviour response formats. Appendix S1 gives content validity indices for each item of each questionnaire investigated. Discriminant content validity is ideally applied while the measure is being developed, before it is used to measure the construct(s), but can also be applied after a measure has been used. © 2014 The British Psychological Society.

  1. Comparisons of topological properties in autism for the brain network construction methods

    NASA Astrophysics Data System (ADS)

    Lee, Min-Hee; Kim, Dong Youn; Lee, Sang Hyeon; Kim, Jin Uk; Chung, Moo K.

    2015-03-01

Structural brain networks can be constructed from the white matter fiber tractography of diffusion tensor imaging (DTI), and the structural characteristics of the brain can be analyzed from these networks. When brain networks are constructed by the parcellation method, their network structures change according to the parcellation scale selection and arbitrary thresholding. To overcome these issues, we modified the Ɛ-neighbor construction method proposed by Chung et al. (2011). The purpose of this study was to construct brain networks for 14 control subjects and 16 subjects with autism using both the parcellation and the Ɛ-neighbor construction methods and to compare their topological properties. As the number of nodes increased, connectedness decreased in the parcellation method; in the Ɛ-neighbor construction method, however, connectedness remained at a high level even as the number of nodes rose. In addition, statistical analysis for the parcellation method showed a significant difference only in path length, whereas statistical analysis for the Ɛ-neighbor construction method showed significant differences in path length, degree and density.

  2. Selecting the Right Construction Delivery Method for a Specific Project.

    ERIC Educational Resources Information Center

    Klinger, Jeff; Booth, Scott

    2002-01-01

    Discusses the costs and benefits of various construction delivery methods for higher education facility projects, including the traditional lump sum general contracting approach (also known as design/bid/build); design-build; and, in the case of private institutions, guaranteed maximum pricing offered by those firms willing to perform construction…

  3. Simple Method for Markerless Gene Deletion in Multidrug-Resistant Acinetobacter baumannii

    PubMed Central

    Oh, Man Hwan; Lee, Je Chul; Kim, Jungmin

    2015-01-01

    The traditional markerless gene deletion technique based on overlap extension PCR has been used for generating gene deletions in multidrug-resistant Acinetobacter baumannii. However, the method is time-consuming because it requires restriction digestion of the PCR products in DNA cloning and the construction of new vectors containing a suitable antibiotic resistance cassette for the selection of A. baumannii merodiploids. Moreover, the availability of restriction sites and the selection of recombinant bacteria harboring the desired chimeric plasmid are limited, making the construction of a chimeric plasmid more difficult. We describe a rapid and easy cloning method for markerless gene deletion in A. baumannii, which has no limitation in the availability of restriction sites and allows for easy selection of the clones carrying the desired chimeric plasmid. Notably, it is not necessary to construct new vectors in our method. This method utilizes direct cloning of blunt-end DNA fragments, in which upstream and downstream regions of the target gene are fused with an antibiotic resistance cassette via overlap extension PCR and are inserted into a blunt-end suicide vector developed for blunt-end cloning. Importantly, the antibiotic resistance cassette is placed outside the downstream region in order to enable easy selection of the recombinants carrying the desired plasmid, to eliminate the antibiotic resistance cassette via homologous recombination, and to avoid the necessity of constructing new vectors. This strategy was successfully applied to functional analysis of the genes associated with iron acquisition by A. baumannii ATCC 19606 and to ompA gene deletion in other A. baumannii strains. Consequently, the proposed method is invaluable for markerless gene deletion in multidrug-resistant A. baumannii. PMID:25746991

  4. Potential of Progressive Construction Systems in Slovakia

    NASA Astrophysics Data System (ADS)

    Kozlovska, Maria; Spisakova, Marcela; Mackova, Daniela

    2017-10-01

The construction industry is a sector with rapid development. Progressive construction technologies and new construction materials, also called modern methods of construction (MMC), are developed constantly. MMC represent the adoption of construction industrialisation and the use of prefabricated components in building construction. One such modern method is the Varianthaus system, which is based on the insulated-concrete-forms principle and provides a complete production plant for wall, ceiling and roof elements for house construction with high thermal insulation. Another progressive construction system is EcoB, an insulated precast concrete panel combining two layers, insulation and concrete, produced in a factory as a whole. Neither modern method of construction is yet well known or widespread in the Slovak construction market. This paper aims to demonstrate the potential of using MMC in Slovakia. This potential is demonstrated through a comparison of selected parameters of the construction process: construction costs and construction time. The subject of this study is a family house modelled in three material variants: masonry construction (representing traditional methods of construction), Varianthaus and EcoB (representing modern methods of construction). The results of this study provide useful information for the decision-making process of potential construction investors.

  5. Guidelines for Selecting a Construction Approach for Education Building Programs.

    ERIC Educational Resources Information Center

    Barton Malow Co., Southfield, MI.

    This book discusses the advantages and disadvantages of the two most common construction planning methods utilized for educational facilities: general contracting and construction management. Diagrams are provided that illustrate the chain of command and communication within each approach, and highlight considerations that every school district…

  6. Construction of robust prognostic predictors by using projective adaptive resonance theory as a gene filtering method.

    PubMed

    Takahashi, Hiro; Kobayashi, Takeshi; Honda, Hiroyuki

    2005-01-15

    For establishing prognostic predictors of various diseases using DNA microarray technology, it is desirable to select the genes most significant for constructing the prognostic model and to eliminate nonspecific or erroneous genes before constructing the model. We applied projective adaptive resonance theory (PART) to gene screening for DNA microarray data. Genes selected by PART were subjected to our FNN-SWEEP modeling method for the construction of a cancer class prediction model. The model performance was evaluated through comparison with a conventional signal-to-noise (S2N) screening method and the nearest shrunken centroids (NSC) method. The FNN-SWEEP predictor with PART screening could discriminate classes of acute leukemia in blinded data with 97.1% accuracy and classes of lung cancer with 90.0% accuracy, whereas the predictor with S2N achieved only 85.3 and 70.0%, and the predictor with NSC 88.2 and 90.0%, respectively. The results prove that PART is superior for gene screening. The software is available upon request from the authors. honda@nubio.nagoya-u.ac.jp
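    The signal-to-noise (S2N) screening used above as a baseline is a standard gene-ranking statistic: the difference of the two class means divided by the sum of the class standard deviations. The sketch below illustrates that baseline only, not PART or FNN-SWEEP; the function names and data layout are assumptions.

```python
from statistics import mean, stdev

def s2n_scores(expr, labels):
    """Signal-to-noise score per gene: (mu0 - mu1) / (sigma0 + sigma1).

    expr: dict gene -> list of expression values (one per sample)
    labels: list of 0/1 class labels aligned with each gene's values
    """
    scores = {}
    for gene, values in expr.items():
        c0 = [v for v, y in zip(values, labels) if y == 0]
        c1 = [v for v, y in zip(values, labels) if y == 1]
        scores[gene] = (mean(c0) - mean(c1)) / (stdev(c0) + stdev(c1))
    return scores

def top_genes(expr, labels, k):
    """Rank genes by |S2N| and keep the k most discriminative."""
    scores = s2n_scores(expr, labels)
    return sorted(scores, key=lambda g: abs(scores[g]), reverse=True)[:k]
```

    Genes whose class means are far apart relative to their spread receive large absolute scores and survive the screening.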

  7. Compilation of a near-infrared library for the construction of quantitative models of amoxicillin and potassium clavulanate oral dosage forms

    NASA Astrophysics Data System (ADS)

    Zou, Wen-bo; Chong, Xiao-meng; Wang, Yan; Hu, Chang-qin

    2018-05-01

    The accuracy of NIR quantitative models depends on calibration samples with sufficient concentration variability. Conventional sample-collection methods have shortcomings, especially their time consumption, which remains a bottleneck in the application of NIR models for Process Analytical Technology (PAT) control. A study was performed to solve the problem of sample collection for the construction of NIR quantitative models, using amoxicillin and potassium clavulanate oral dosage forms as examples. The aim was to find a general approach to rapidly construct NIR quantitative models from an NIR spectral library, based on the idea of a universal model [2021]. The NIR spectral library of amoxicillin and potassium clavulanate oral dosage forms consisted of spectra of 377 batches of samples produced by 26 domestic pharmaceutical companies, including tablets, dispersible tablets, chewable tablets, oral suspensions, and granules. The correlation coefficient (rT) was used to indicate the similarity of the spectra. Calibration sets were selected from the spectral library according to the median rT of the samples to be analyzed: the rT of the selected samples was close to the median, with differences of 1.0% to 1.5%. We concluded that sample selection is not a problem when constructing NIR quantitative models from a spectral library, in contrast to conventional methods of determining universal models. Sample spectra with a suitable concentration range for the NIR models were collected quickly, and the models constructed through this method were more easily targeted.
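    The selection step described above can be pictured as follows: compute the correlation coefficient rT between each library spectrum and the spectrum of the sample to be analyzed, then keep the library spectra whose rT lies near the median. A minimal sketch under that reading; the function name `select_calibration` and the tolerance value are assumptions (the abstract reports rT spreads of 1.0% to 1.5%).

```python
from math import sqrt
from statistics import median

def pearson(x, y):
    """Pearson correlation between two spectra (lists of absorbances)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def select_calibration(library, target, tol=0.015):
    """Keep library spectra whose correlation with the target spectrum
    lies within +/- tol of the median correlation."""
    r = [pearson(s, target) for s in library]
    m = median(r)
    return [s for s, ri in zip(library, r) if abs(ri - m) <= tol]
```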

  8. Construction of Response Surface with Higher Order Continuity and Its Application to Reliability Engineering

    NASA Technical Reports Server (NTRS)

    Krishnamurthy, T.; Romero, V. J.

    2002-01-01

    The usefulness of piecewise polynomials with C1 and C2 derivative continuity for response surface construction is examined. A Moving Least Squares (MLS) method is developed and compared with four other interpolation methods, including kriging. First, the selected methods are applied and compared with one another on a two-design-variable problem with a known theoretical response function. Next, the methods are tested on a four-design-variable problem from a reliability-based design application. In general, the piecewise polynomial methods with higher-order derivative continuity produce less error in the response prediction. The MLS method was found to be superior for response surface construction among the methods evaluated.
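    A Moving Least Squares response surface refits a low-order polynomial by weighted least squares at every query point, with weights that decay with distance from the query. A one-variable sketch with a linear basis and a Gaussian weight; the bandwidth `h` and the function name are assumptions, not the paper's formulation.

```python
from math import exp

def mls_predict(xs, ys, x0, h=1.0):
    """Moving Least Squares with a linear basis [1, x] and a Gaussian
    weight centred on the query point x0: each evaluation solves a
    locally weighted least-squares fit y = a + b*x."""
    w = [exp(-((x - x0) / h) ** 2) for x in xs]
    # Weighted normal equations for the coefficients (a, b).
    s0 = sum(w)
    s1 = sum(wi * x for wi, x in zip(w, xs))
    s2 = sum(wi * x * x for wi, x in zip(w, xs))
    t0 = sum(wi * y for wi, y in zip(w, ys))
    t1 = sum(wi * x * y for wi, x, y in zip(w, xs, ys))
    det = s0 * s2 - s1 * s1
    a = (t0 * s2 - s1 * t1) / det
    b = (s0 * t1 - s1 * t0) / det
    return a + b * x0
```

    Because the weights move with the query point, the surface adapts locally while remaining smooth, which is the property the paper exploits for C1/C2 continuity.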

  9. One of the criteria for selecting a contractor for high-rise construction

    NASA Astrophysics Data System (ADS)

    Tuskaeva, Zalina; Tagirov, Timur

    2018-03-01

    The mechanisms used and proposed to date for managing the building complex do not always provide the required result when assessing a construction organization's facilities. The development of new, effective methods for such an assessment is therefore an urgent task, especially for questions related to high-rise construction. The article formally states the task of assessing the technical facilities of a construction organization. Using expert methods, the weighted values of the coefficients of local indicators of technical facilities are identified.

  10. Vibration and acoustic frequency spectra for industrial process modeling using selective fusion multi-condition samples and multi-source features

    NASA Astrophysics Data System (ADS)

    Tang, Jian; Qiao, Junfei; Wu, ZhiWei; Chai, Tianyou; Zhang, Jian; Yu, Wen

    2018-01-01

    Frequency spectral data of mechanical vibration and acoustic signals relate to difficult-to-measure production quality and quantity parameters of complex industrial processes. A selective ensemble (SEN) algorithm can be used to build a soft sensor model of these process parameters by selectively fusing valued information from different perspectives. However, a combination of several optimized ensemble sub-models with SEN cannot guarantee the best prediction model. In this study, we construct a data-driven model of industrial process parameters from mechanical vibration and acoustic frequency spectra, based on selective fusion of multi-condition samples and multi-source features. A multi-layer SEN (MLSEN) strategy is used to simulate the domain expert's cognitive process. A genetic algorithm and kernel partial least squares are used to construct the inside-layer SEN sub-model for each mechanical vibration and acoustic frequency spectral feature subset. Branch-and-bound and adaptive weighted fusion algorithms are integrated to select and combine the outputs of the inside-layer SEN sub-models, from which the outside-layer SEN is constructed. Thus, "sub-sampling training examples"-based and "manipulating input features"-based ensemble construction methods are integrated, realizing selective information fusion based on multi-condition history samples and multi-source input features. This novel approach is applied to a laboratory-scale ball mill grinding process. A comparison with other methods indicates that the proposed MLSEN approach effectively models mechanical vibration and acoustic signals.

  11. Application of Multilabel Learning Using the Relevant Feature for Each Label in Chronic Gastritis Syndrome Diagnosis

    PubMed Central

    Liu, Guo-Ping; Yan, Jian-Jun; Wang, Yi-Qin; Fu, Jing-Jing; Xu, Zhao-Xia; Guo, Rui; Qian, Peng

    2012-01-01

    Background. In Traditional Chinese Medicine (TCM), most of the algorithms are used to solve problems of syndrome diagnosis that only focus on one syndrome, that is, single label learning. However, in clinical practice, patients may simultaneously have more than one syndrome, which has its own symptoms (signs). Methods. We employed a multilabel learning using the relevant feature for each label (REAL) algorithm to construct a syndrome diagnostic model for chronic gastritis (CG) in TCM. REAL combines feature selection methods to select the significant symptoms (signs) of CG. The method was tested on 919 patients using the standard scale. Results. The highest prediction accuracy was achieved when 20 features were selected. The features selected with the information gain were more consistent with the TCM theory. The lowest average accuracy was 54% using multi-label neural networks (BP-MLL), whereas the highest was 82% using REAL for constructing the diagnostic model. For coverage, hamming loss, and ranking loss, the values obtained using the REAL algorithm were the lowest at 0.160, 0.142, and 0.177, respectively. Conclusion. REAL extracts the relevant symptoms (signs) for each syndrome and improves its recognition accuracy. Moreover, the studies will provide a reference for constructing syndrome diagnostic models and guide clinical practice. PMID:22719781
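    Of the multilabel metrics reported above, hamming loss has the simplest definition: the fraction of label slots predicted incorrectly, averaged over samples and labels. A minimal sketch, assuming predictions and ground truth are given as 0/1 label matrices.

```python
def hamming_loss(y_true, y_pred):
    """Fraction of label slots predicted incorrectly, averaged over
    all samples and all labels (lower is better)."""
    n, L = len(y_true), len(y_true[0])
    wrong = sum(t != p
                for row_t, row_p in zip(y_true, y_pred)
                for t, p in zip(row_t, row_p))
    return wrong / (n * L)
```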

  12. Feature selection gait-based gender classification under different circumstances

    NASA Astrophysics Data System (ADS)

    Sabir, Azhin; Al-Jawad, Naseer; Jassim, Sabah

    2014-05-01

    This paper proposes gender classification based on human gait features and investigates two variations in addition to the normal gait sequence: clothing (wearing coats) and carrying a bag. The feature vectors in the proposed system are constructed after applying the wavelet transform. Three different feature sets are proposed in this method. The first, spatio-temporal distances, captures the distances between different parts of the human body (such as the feet, knees, hands, height and shoulders) during one gait cycle. The second and third feature sets are constructed from the approximation and non-approximation coefficients of the human body, respectively. To extract these two feature sets, we divide the human body into upper and lower parts based on the golden-ratio proportion. We adopt a statistical method for constructing the feature vector from the above sets. The dimension of the constructed feature vector is then reduced using the Fisher score as a feature selection method to optimize its discriminating significance. Finally, k-Nearest Neighbor is applied as the classification method. Experimental results demonstrate that our approach provides a more realistic scenario and relatively better performance than existing approaches.
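    The Fisher score used above for dimension reduction ranks each feature by the scatter of its class means relative to its within-class variance. A minimal sketch of that ranking step; the helper names and the row-per-sample data layout are assumptions.

```python
from statistics import mean, pvariance

def fisher_scores(X, y):
    """Fisher score per feature: between-class scatter of the class
    means divided by the pooled within-class variance."""
    n_features = len(X[0])
    classes = sorted(set(y))
    scores = []
    for j in range(n_features):
        col = [row[j] for row in X]
        mu = mean(col)
        num = den = 0.0
        for c in classes:
            vals = [v for v, yi in zip(col, y) if yi == c]
            num += len(vals) * (mean(vals) - mu) ** 2
            den += len(vals) * pvariance(vals)
        scores.append(num / den if den else float("inf"))
    return scores

def select_top(X, y, k):
    """Indices of the k features with the highest Fisher score."""
    s = fisher_scores(X, y)
    return sorted(range(len(s)), key=lambda j: s[j], reverse=True)[:k]
```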

  13. Research on the selection of innovation compound using Possibility Construction Space Theory and fuzzy pattern recognition

    NASA Astrophysics Data System (ADS)

    Xie, Songhua; Li, Dehua; Nie, Hui

    2009-10-01

    There are a large number of fuzzy concepts and fuzzy phenomena in traditional Chinese medicine, which have led to great difficulties for study of traditional Chinese medicine. In this paper, the mathematical methods are used to quantify fuzzy concepts of drugs and prescription. We put forward the process of innovation formulations and selection method in Chinese medicine based on the Possibility Construction Space Theory (PCST) and fuzzy pattern recognition. Experimental results show that the method of selecting medicines from a number of characteristics of traditional Chinese medicine is consistent with the basic theory of traditional Chinese medicine. The results also reflect the integrated effects of the innovation compound. Through the use of the innovation formulations system, we expect to provide software tools for developing new traditional Chinese medicine and to inspire traditional Chinese medicine researchers to develop novel drugs.

  14. An Analysis of Cost Premiums and Losses Associated with USAF Military Construction (MILCON)

    DTIC Science & Technology

    2013-03-01

    Corps of Engineers (USACE) or the Naval Facilities Engineering Command (NAVFAC) for design and construction of the annual military construction...AFCEC in lieu of AFCEE or AFCESA. The selection, and policies, of design and construction agents may cause cost premiums for Air Force MILCON...private industry best practices such as relational contracting, schedule performance, and design -build procurement methods. All of these research

  15. Cost containment and KSC Shuttle facilities or cost containment and aerospace construction

    NASA Technical Reports Server (NTRS)

    Brown, J. A.

    1985-01-01

    This presentation aims to show examples of cost containment in aerospace construction at Kennedy Space Center (KSC), considering four major levels of project development of the Space Shuttle facilities. The levels relate to conceptual criteria and site selection, the design of construction and ground support equipment (GSE), the construction of facilities and GSE, and operation and maintenance. Examples of cost containment are discussed. The continued reduction of processing time from landing to launching demonstrates the success of the cost containment methods. Attention is given to the factors which led to the selection of KSC, the use of cost engineering, the employment of the construction management concept, and the use of computer-aided design/drafting.

  16. A new algorithm to construct phylogenetic networks from trees.

    PubMed

    Wang, J

    2014-03-06

    Developing appropriate methods for constructing phylogenetic networks from tree sets is an important problem, and much research is currently being undertaken in this area. BIMLR is an algorithm that constructs phylogenetic networks from tree sets. The algorithm can construct a much simpler network than other available methods. Here, we introduce an improved version of the BIMLR algorithm, QuickCass. QuickCass changes the selection strategy of the labels of leaves below the reticulate nodes, i.e., the nodes with an indegree of at least 2 in BIMLR. We show that QuickCass can construct simpler phylogenetic networks than BIMLR. Furthermore, we show that QuickCass is a polynomial-time algorithm when the output network that is constructed by QuickCass is binary.

  17. Recognition method of construction conflict based on driver's eye movement.

    PubMed

    Xu, Yi; Li, Shiwu; Gao, Song; Tan, Derong; Guo, Dong; Wang, Yuqiong

    2018-04-01

    Drivers' eye movement data in simulated construction conflicts at different speeds were collected and analyzed to find the relationship between drivers' eye movement and the construction conflict. On the basis of this relationship, the peak point of the wavelet-processed pupil diameter, the first point on the left side of the peak point, and the first blink point after the peak point were selected as key points for locating construction conflict periods. On the basis of these key points and the GSA, a construction conflict recognition method, the CCFRM, is proposed, and its recognition speed and location accuracy are verified. The good performance of the CCFRM confirms the feasibility of the proposed key points for construction conflict recognition. Copyright © 2018 Elsevier Ltd. All rights reserved.

  18. Differential evolution algorithm-based kernel parameter selection for Fukunaga-Koontz Transform subspaces construction

    NASA Astrophysics Data System (ADS)

    Binol, Hamidullah; Bal, Abdullah; Cukur, Huseyin

    2015-10-01

    The performance of kernel-based techniques depends on the selection of kernel parameters, so suitable parameter selection is an important problem for many such techniques. This article presents a novel technique to learn the kernel parameters of a kernel Fukunaga-Koontz Transform (KFKT) based classifier. The proposed approach determines appropriate kernel parameter values by optimizing an objective function constructed from the discrimination ability of KFKT, using the differential evolution algorithm (DEA). The new technique overcomes disadvantages of the traditional cross-validation method, such as its high time consumption, and can be applied to any type of data. Experiments on target detection in hyperspectral images verify the effectiveness of the proposed method.
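    Differential evolution itself is a small population-based optimizer: each candidate is perturbed by a scaled difference of two other candidates, crossed over with the original, and replaced only if the trial improves the objective. A generic DE/rand/1/bin sketch; the KFKT discrimination objective from the paper is replaced here by an arbitrary test function, and all names and default settings are assumptions.

```python
import random

def differential_evolution(objective, bounds, pop_size=20, F=0.8, CR=0.9,
                           generations=100, seed=0):
    """Minimise `objective` over a box given by `bounds` using
    DE/rand/1/bin. In the paper's setting, `objective` would score the
    KFKT classifier for a candidate kernel parameter vector."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fit = [objective(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # Mutation: scaled difference of two others added to a base vector.
            a, b, c = rng.sample([x for j, x in enumerate(pop) if j != i], 3)
            trial = []
            j_rand = rng.randrange(dim)  # ensure at least one mutated gene
            for j in range(dim):
                if rng.random() < CR or j == j_rand:
                    v = a[j] + F * (b[j] - c[j])
                    lo, hi = bounds[j]
                    trial.append(min(max(v, lo), hi))  # clip to bounds
                else:
                    trial.append(pop[i][j])
            # Greedy selection: keep the trial only if it improves.
            f = objective(trial)
            if f < fit[i]:
                pop[i], fit[i] = trial, f
    best = min(range(pop_size), key=fit.__getitem__)
    return pop[best], fit[best]
```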

  19. Establishing an efficient way to utilize the drought resistance germplasm population in wheat.

    PubMed

    Wang, Jiancheng; Guan, Yajing; Wang, Yang; Zhu, Liwei; Wang, Qitian; Hu, Qijuan; Hu, Jin

    2013-01-01

    Drought resistance breeding provides a promising way to improve the yield and quality of wheat in arid and semiarid regions, and constructing a core collection is an efficient way to evaluate and utilize drought-resistant germplasm resources in wheat. In the present research, 1,683 wheat varieties were divided into five germplasm groups (high resistant, HR; resistant, R; moderate resistant, MR; susceptible, S; and high susceptible, HS). The least distance stepwise sampling (LDSS) method was adopted to select core accessions. Six commonly used genetic distances (Euclidean, standardized Euclidean, Mahalanobis, Manhattan, cosine, and correlation distance) were used to assess genetic distances among accessions, and the unweighted pair-group average (UPGMA) method was used for hierarchical cluster analysis. The coincidence rate of range (CR) and the variable rate of the coefficient of variation (VR) were adopted to evaluate the representativeness of the core collection. A method for selecting the ideal construction strategy is suggested, and a wheat core collection for drought resistance breeding programs was constructed with the strategy so selected. Principal component analysis showed that genetic diversity was well preserved in that core collection.
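    The distance computations underlying LDSS are elementary. The sketch below implements two of the six distances named above (Euclidean and Manhattan) and one plausible reading of least-distance stepwise sampling: repeatedly discard one member of the closest remaining pair until the target core size is reached. This is an illustrative simplification under assumed names, not the published algorithm.

```python
from math import sqrt

def euclid(a, b):
    """Euclidean distance between two trait vectors."""
    return sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def manhat(a, b):
    """Manhattan (city-block) distance between two trait vectors."""
    return sum(abs(x - y) for x, y in zip(a, b))

def ldss_core(accessions, target_size, dist=euclid):
    """Stepwise sampling sketch: while the collection is too large,
    find the closest pair and drop one of its members, so the most
    redundant accessions are removed first."""
    core = list(accessions)
    while len(core) > target_size:
        i, j = min(((i, j) for i in range(len(core))
                            for j in range(i + 1, len(core))),
                   key=lambda p: dist(core[p[0]], core[p[1]]))
        core.pop(j)
    return core
```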

  20. Social construction of the patient through problems of safety, uninsurance, and unequal treatment.

    PubMed

    Trigg, Lisa J

    2009-01-01

    The purpose of this research was to study how the Institute of Medicine discourse promoting health information technology may reproduce existing social inequalities in healthcare. Social constructionist and critical discourse analysis combined with corpus linguistics methods have been used to study the subject positions constructed for receivers of healthcare across the executive summaries of 3 different Institute of Medicine reports. Data analysis revealed differences in the way receivers of healthcare are constructed through variations of social action through language use in the 3 texts selected for this method's testing.

  1. Comparison of Housing Construction Development in Selected Regions of Central Europe

    NASA Astrophysics Data System (ADS)

    Dvorský, Ján; Petráková, Zora; Hollý, Ján

    2017-12-01

    In fast-growing countries, the economic growth that followed the global financial crisis ought to be manifested in the development of housing policy. The development of a region is directly related to the quality of life of its inhabitants, and housing construction and its relation to housing availability is a key issue for the population overall. Comparing its development across selected regions is important for construction experts, the mayors of the regions and the state, but especially for the inhabitants themselves. The aim of the article is to compare the number of new dwellings with building permits and of completed dwellings with final building approval between selected regions using a mathematical statistics method, analysis of variance. The article also uses the tools of descriptive statistics, such as point graphs, graphs of deviations from the average, and basic statistical characteristics of mean and variability. Qualitative factors influencing the construction of flats, as well as the causes of quantitative differences in the numbers of started and completed apartments in selected regions of Central Europe, are the subjects of the article's conclusions.
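    The comparison method named above, one-way analysis of variance, reduces to a single F statistic: variance between the group means over pooled variance within the groups. A minimal sketch, with the group data layout assumed (e.g. dwellings completed per year, one list per region); the resulting F value would then be compared against an F distribution.

```python
from statistics import mean

def one_way_anova_f(groups):
    """One-way ANOVA F statistic for comparing group means,
    e.g. numbers of completed dwellings per year across regions."""
    k = len(groups)                      # number of groups
    n = sum(len(g) for g in groups)      # total observations
    grand = mean([x for g in groups for x in g])
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - mean(g)) ** 2 for x in g) for g in groups)
    df_between, df_within = k - 1, n - k
    return (ss_between / df_between) / (ss_within / df_within)
```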

  2. Selection of suitable NDT methods for building inspection

    NASA Astrophysics Data System (ADS)

    Pauzi Ismail, Mohamad

    2017-11-01

    The construction of modern structures requires good-quality concrete with adequate strength and durability. Several accidents in civil construction, reported in the media, were due to poor workmanship and a lack of systematic monitoring during construction; water leakage and cracking in residential houses are also commonly reported. Monitoring the quality of concrete in structures is therefore becoming an increasingly important subject. This paper describes the major non-destructive testing (NDT) methods for evaluating the structural integrity of concrete buildings. Some interesting findings from actual on-site NDT inspections are presented, and the NDT methods used are explained, compared and discussed. Suitable methods are suggested as the minimum set of NDT methods covering the parameters required in an inspection.

  3. Selection of adequate site location during early stages of construction project management: A multi-criteria decision analysis approach

    NASA Astrophysics Data System (ADS)

    Marović, Ivan; Hanak, Tomaš

    2017-10-01

    In the management of construction projects, special attention should be given to planning as the most important phase of the decision-making process. Quality decision-making, based on adequate and comprehensive collaboration of all involved stakeholders, is crucial in a project's early stages. The problem arises from the specific conditions of the construction industry (the final product is inseparable from its location, i.e. the location strongly influences the building design and its structural characteristics as well as the technology used during construction), from investors' desires and attitudes, and from socioeconomic and environmental aspects. Considering all of these, the selection of an adequate construction site location for a future investment is a complex, poorly structured, multi-criteria problem. To take all these dimensions into account, a model for the selection of an adequate site location is devised. The model is based on the AHP (for designing the decision-making hierarchy) and PROMETHEE (for pairwise comparison of investment locations) methods. By combining the basic features of both methods, operational synergies can be achieved in multi-criteria decision analysis. This gives the decision-maker a sense of assurance, knowing that if the procedure proposed by the presented model is followed, it will lead to a rational, carefully and systematically thought-out decision.
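    The AHP step can be sketched compactly: given a reciprocal pairwise comparison matrix of criteria, priority weights are commonly approximated by the normalized geometric means of the rows (the principal eigenvector is the classical alternative). The matrix values below are illustrative, not from the paper.

```python
from math import prod

def ahp_weights(pairwise):
    """Approximate AHP priority weights from a reciprocal pairwise
    comparison matrix using the geometric-mean (row) method."""
    n = len(pairwise)
    gm = [prod(row) ** (1.0 / n) for row in pairwise]  # row geometric means
    total = sum(gm)
    return [g / total for g in gm]  # normalise to sum to 1
```

    PROMETHEE would then take over, using such weights to aggregate pairwise preference flows between the candidate site locations.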

  4. Improvement of the material and transport component of the system of construction waste management

    NASA Astrophysics Data System (ADS)

    Kostyshak, Mikhail; Lunyakov, Mikhail

    2017-10-01

    The relevance of this research stems from the growth of construction activity and the rising volume of construction and demolition waste. This article considers modern approaches to managing the turnover of construction waste, the sequence of reconstruction or demolition processes of a building, the information flow of the complete cycle of construction and demolition waste turnover, and methods for improving the material and transport component of the construction waste management system. The analysis showed that the proposed construction waste management mechanism increases the efficiency and environmental safety of this branch and its regions.

  5. Long-term behavior of integral abutment bridges : appendix E, INDOT design manual : selected recommendations for integral abutment bridges.

    DOT National Transportation Integrated Search

    2011-01-01

    Integral abutment (IA) construction has become the preferred method over conventional construction for use with typical highway bridges. However, the use of these structures is limited due to state mandated length and skew limitations. To expand thei...

  6. An Analysis of Methods Used to Examine Gender Differences in Computer-Related Behavior.

    ERIC Educational Resources Information Center

    Kay, Robin

    1992-01-01

    Review of research investigating gender differences in computer-related behavior examines statistical and methodological flaws. Issues addressed include sample selection, sample size, scale development, scale quality, the use of univariate and multivariate analyses, regressional analysis, construct definition, construct testing, and the…

  7. Bacteriophage vehicles for phage display: biology, mechanism, and application.

    PubMed

    Ebrahimizadeh, Walead; Rajabibazl, Masoumeh

    2014-08-01

    The phage display technique is a powerful tool for the selection of various biological agents. It allows the construction of large libraries from the antibody repertoires of different hosts and provides a fast, high-throughput selection method: specific antibodies can be isolated by their distinctive characteristics from a library of millions of members. These features have made phage display the preferred method for antibody selection and engineering. Several phage display methods are available, each with its own merits and applications, and selecting the appropriate display technique requires basic knowledge of the available methods and their mechanisms. In this review, we describe the different phage display techniques, the available bacteriophage vehicles, and their mechanisms.

  8. Rapid construction of a Bacterial Artificial Chromosomal (BAC) expression vector using designer DNA fragments.

    PubMed

    Chen, Chao; Zhao, Xinqing; Jin, Yingyu; Zhao, Zongbao Kent; Suh, Joo-Won

    2014-11-01

    Bacterial artificial chromosomal (BAC) vectors are increasingly being used in cloning large DNA fragments containing complex biosynthetic pathways to facilitate heterologous production of microbial metabolites for drug development. To express inserted genes using Streptomyces species as the production hosts, an integration expression cassette is required to be inserted into the BAC vector, which includes genetic elements encoding a phage-specific attachment site, an integrase, an origin of transfer, a selection marker and a promoter. Due to the large sizes of DNA inserted into the BAC vectors, it is normally inefficient and time-consuming to assemble these fragments by routine PCR amplifications and restriction-ligations. Here we present a rapid method to insert fragments to construct BAC-based expression vectors. A DNA fragment of about 130 bp was designed, which contains upstream and downstream homologous sequences of both BAC vector and pIB139 plasmid carrying the whole integration expression cassette. In-Fusion cloning was performed using the designer DNA fragment to modify pIB139, followed by λ-RED-mediated recombination to obtain the BAC-based expression vector. We demonstrated the effectiveness of this method by rapid construction of a BAC-based expression vector with an insert of about 120 kb that contains the entire gene cluster for biosynthesis of immunosuppressant FK506. The empty BAC-based expression vector constructed in this study can be conveniently used for construction of BAC libraries using either microbial pure culture or environmental DNA, and the selected BAC clones can be directly used for heterologous expression. Alternatively, if a BAC library has already been constructed using a commercial BAC vector, the selected BAC vectors can be manipulated using the method described here to get the BAC-based expression vectors with desired gene clusters for heterologous expression. 
The rapid construction of a BAC-based expression vector facilitates heterologous expression of large gene clusters for drug discovery. Copyright © 2014 Elsevier Inc. All rights reserved.

  9. Sampling in health geography: reconciling geographical objectives and probabilistic methods. An example of a health survey in Vientiane (Lao PDR)

    PubMed Central

    Vallée, Julie; Souris, Marc; Fournet, Florence; Bochaton, Audrey; Mobillion, Virginie; Peyronnie, Karine; Salem, Gérard

    2007-01-01

    Background Geographical objectives and probabilistic methods are difficult to reconcile in a unique health survey. Probabilistic methods focus on individuals to provide estimates of a variable's prevalence with a certain precision, while geographical approaches emphasise the selection of specific areas to study interactions between spatial characteristics and health outcomes. A sample selected from a small number of specific areas creates statistical challenges: the observations are not independent at the local level, and this results in poor statistical validity at the global level. Therefore, it is difficult to construct a sample that is appropriate for both geographical and probability methods. Methods We used a two-stage selection procedure with a first non-random stage of selection of clusters. Instead of randomly selecting clusters, we deliberately chose a group of clusters, which as a whole would contain all the variation in health measures in the population. As there was no health information available before the survey, we selected a priori determinants that can influence the spatial homogeneity of the health characteristics. This method yields a distribution of variables in the sample that closely resembles that in the overall population, something that cannot be guaranteed with randomly-selected clusters, especially if the number of selected clusters is small. In this way, we were able to survey specific areas while minimising design effects and maximising statistical precision. Application We applied this strategy in a health survey carried out in Vientiane, Lao People's Democratic Republic. We selected well-known health determinants with unequal spatial distribution within the city: nationality and literacy. We deliberately selected a combination of clusters whose distribution of nationality and literacy is similar to the distribution in the general population. 
Conclusion This paper describes the conceptual reasoning behind the construction of the survey sample and shows that it can be advantageous to choose clusters using reasoned hypotheses, based on both probability and geographical approaches, in contrast to a conventional, random cluster selection strategy. PMID:17543100

  10. Construction and evaluation of ion selective electrodes for nitrate with a summing operational amplifier. Application to tobacco analysis.

    PubMed

    Pérez-Olmos, R; Rios, A; Fernández, J R; Lapa, R A; Lima, J L

    2001-01-05

    This paper describes the construction and evaluation of a nitrate-selective electrode with improved sensitivity, built like a conventional ion-selective electrode (ISE) but using an operational amplifier to sum the potentials supplied by four membranes (ESOA). Both types of electrode, without an inner reference solution, were constructed using tetraoctylammonium bromide as the sensor, dibutylphthalate as the solvent mediator and PVC as the plastic matrix, with the membranes applied directly onto a conductive epoxy resin support. After a comparative evaluation of their working characteristics, the electrodes were used to determine nitrate in different types of tobacco. The limit of detection of the direct potentiometric method was found to be 0.18 g kg(-1), and the precision and accuracy of the method, when applied to eight different samples of tobacco and expressed as the mean R.S.D. and the average percentage of spike recovery, were 0.6 and 100.3%, respectively. Comparison of variances showed, in all cases, that the results obtained with the ESOA were similar to those obtained with the conventional ISE, but with higher precision. Linear regression analysis showed good agreement (r=0.9994) between the results of the developed potentiometric method and those of a spectrophotometric reference method based on brucine when both were applied to 32 samples of different types of tobacco.

  11. Using multiple classifiers for predicting the risk of endovascular aortic aneurysm repair re-intervention through hybrid feature selection.

    PubMed

    Attallah, Omneya; Karthikesalingam, Alan; Holt, Peter Je; Thompson, Matthew M; Sayers, Rob; Bown, Matthew J; Choke, Eddie C; Ma, Xianghong

    2017-11-01

Feature selection is essential in the medical field; however, the process becomes complicated in the presence of censoring, the distinctive characteristic of survival analysis. Most survival feature selection methods are based on Cox's proportional hazards model, even though machine learning classifiers would often be preferred. Classifiers are less employed in survival analysis because censoring prevents them from being applied directly to survival data. Among the few works that have employed machine learning classifiers, the partial logistic artificial neural network with auto-relevance determination is a well-known method that deals with censoring and performs feature selection for survival data. However, it depends on data replication to handle censoring, which leads to unbalanced and biased prediction results, especially in highly censored data, and other methods cannot deal with high censoring at all. Therefore, in this article, a new hybrid feature selection method is proposed which offers a solution to high-level censoring. It combines support vector machine, neural network, and K-nearest neighbor classifiers using simple majority voting and a new weighted majority voting method based on a survival metric to construct a multiple classifier system. The new hybrid feature selection process uses the multiple classifier system as a wrapper method and merges it with an iterated feature ranking filter method to further reduce features. Two endovascular aortic repair datasets containing 91% censored patients, collected from two centers, were used to construct a multicenter study to evaluate the performance of the proposed approach. The results showed the proposed technique outperformed individual classifiers and variable selection methods based on Cox's model, such as the Akaike and Bayesian information criteria and the least absolute shrinkage and selection operator, in p values of the log-rank test, sensitivity, and concordance index. This indicates that the proposed classifier is more powerful in correctly predicting the risk of re-intervention, enabling doctors to select patients' future follow-up plans.
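The vote-combination step of such a multiple classifier system can be sketched in plain Python. This is an illustrative sketch, not the authors' implementation: the predictions and the survival-metric weights below are hypothetical, and in real use they would come from trained SVM, neural network, and K-nearest neighbor models.

```python
from collections import Counter

def majority_vote(predictions):
    """Combine per-classifier predictions (one list per classifier) into one
    label per sample by simple majority voting."""
    n_samples = len(predictions[0])
    combined = []
    for i in range(n_samples):
        votes = [p[i] for p in predictions]
        combined.append(Counter(votes).most_common(1)[0][0])
    return combined

def weighted_majority_vote(predictions, weights):
    """Weighted variant: each classifier's vote counts with its weight
    (e.g., a survival-metric score such as the concordance index)."""
    n_samples = len(predictions[0])
    combined = []
    for i in range(n_samples):
        tally = {}
        for p, w in zip(predictions, weights):
            tally[p[i]] = tally.get(p[i], 0.0) + w
        combined.append(max(tally, key=tally.get))
    return combined

# Hypothetical predictions from SVM, NN and KNN on four patients (1 = re-intervention).
svm, nn, knn = [1, 0, 1, 0], [1, 1, 0, 0], [0, 1, 1, 0]
print(majority_vote([svm, nn, knn]))                            # -> [1, 1, 1, 0]
print(weighted_majority_vote([svm, nn, knn], [0.7, 0.6, 0.5]))  # -> [1, 1, 1, 0]
```

In the wrapper setting, a candidate feature subset would be scored by the survival performance of the combined predictions rather than of any single classifier.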

  12. Space-planning and structural solutions of low-rise buildings: Optimal selection methods

    NASA Astrophysics Data System (ADS)

    Gusakova, Natalya; Minaev, Nikolay; Filushina, Kristina; Dobrynina, Olga; Gusakov, Alexander

    2017-11-01

The present study is devoted to elaborating a methodology for the appropriate selection of space-planning and structural solutions for low-rise buildings. The objective of the study is to work out a system of criteria influencing the selection of the space-planning and structural solutions most suitable for low-rise buildings and structures. Applying the defined criteria in practice aims to enhance the efficiency of capital investments, save energy and resources, and create comfortable conditions for the population, taking into account the climatic zoning of the construction site. The project's results can be applied when implementing investment-construction projects of low-rise housing in different kinds of territories based on local building materials. A system of criteria influencing the optimal selection of space-planning and structural solutions for low-rise buildings has been developed. A methodological basis has also been elaborated for assessing the optimal selection of space-planning and structural solutions satisfying the requirements of energy efficiency, comfort, safety, and economic efficiency. The elaborated methodology makes it possible to intensify the development of low-rise construction in different types of territories, taking into account the climatic zoning of the construction site. Stimulation of low-rise construction should be based on a system of scientifically justified approaches, thereby enhancing the energy efficiency, comfort, safety, and economic effectiveness of low-rise buildings.

  13. Methodology for Developing and Evaluating the PROMIS® Smoking Item Banks

    PubMed Central

    Cai, Li; Stucky, Brian D.; Tucker, Joan S.; Shadel, William G.; Edelen, Maria Orlando

    2014-01-01

    Introduction: This article describes the procedures used in the PROMIS® Smoking Initiative for the development and evaluation of item banks, short forms (SFs), and computerized adaptive tests (CATs) for the assessment of 6 constructs related to cigarette smoking: nicotine dependence, coping expectancies, emotional and sensory expectancies, health expectancies, psychosocial expectancies, and social motivations for smoking. Methods: Analyses were conducted using response data from a large national sample of smokers. Items related to each construct were subjected to extensive item factor analyses and evaluation of differential item functioning (DIF). Final item banks were calibrated, and SF assessments were developed for each construct. The performance of the SFs and the potential use of the item banks for CAT administration were examined through simulation study. Results: Item selection based on dimensionality assessment and DIF analyses produced item banks that were essentially unidimensional in structure and free of bias. Simulation studies demonstrated that the constructs could be accurately measured with a relatively small number of carefully selected items, either through fixed SFs or CAT-based assessment. Illustrative results are presented, and subsequent articles provide detailed discussion of each item bank in turn. Conclusions: The development of the PROMIS smoking item banks provides researchers with new tools for measuring smoking-related constructs. The use of the calibrated item banks and suggested SF assessments will enhance the quality of score estimates, thus advancing smoking research. Moreover, the methods used in the current study, including innovative approaches to item selection and SF construction, may have general relevance to item bank development and evaluation. PMID:23943843

  14. Module-based construction of plasmids for chromosomal integration of the fission yeast Schizosaccharomyces pombe

    PubMed Central

    Kakui, Yasutaka; Sunaga, Tomonari; Arai, Kunio; Dodgson, James; Ji, Liang; Csikász-Nagy, Attila; Carazo-Salas, Rafael; Sato, Masamitsu

    2015-01-01

Integration of an external gene into a fission yeast chromosome is useful to investigate the effect of the gene product. An easy way to knock-in a gene construct is the use of an integration plasmid, which can be targeted and inserted to a chromosome through homologous recombination. Despite the advantage of integration, construction of integration plasmids is energy- and time-consuming, because there is no systematic library of integration plasmids with various promoters, fluorescent protein tags, terminators and selection markers; therefore, researchers are often forced to make appropriate ones through multiple rounds of cloning procedures. Here, we establish materials and methods to easily construct integration plasmids. We introduce a convenient cloning system based on Golden Gate DNA shuffling, which enables the connection of multiple DNA fragments at once: any kind of promoters and terminators, the gene of interest, in combination with any fluorescent protein tag genes and any selection markers. Each of those DNA fragments, called a ‘module’, can be tandemly ligated in the order we desire in a single reaction, which yields a circular plasmid in a one-step manner. The resulting plasmids can be integrated through standard methods for transformation. Thus, these materials and methods facilitate easy construction of knock-in strains, and this will further increase the value of fission yeast as a model organism. PMID:26108218

  15. Sampling in health geography: reconciling geographical objectives and probabilistic methods. An example of a health survey in Vientiane (Lao PDR).

    PubMed

    Vallée, Julie; Souris, Marc; Fournet, Florence; Bochaton, Audrey; Mobillion, Virginie; Peyronnie, Karine; Salem, Gérard

    2007-06-01

Geographical objectives and probabilistic methods are difficult to reconcile in a unique health survey. Probabilistic methods focus on individuals to provide estimates of a variable's prevalence with a certain precision, while geographical approaches emphasise the selection of specific areas to study interactions between spatial characteristics and health outcomes. A sample selected from a small number of specific areas creates statistical challenges: the observations are not independent at the local level, and this results in poor statistical validity at the global level. Therefore, it is difficult to construct a sample that is appropriate for both geographical and probability methods. We used a two-stage selection procedure with a first non-random stage of selection of clusters. Instead of randomly selecting clusters, we deliberately chose a group of clusters, which as a whole would contain all the variation in health measures in the population. As there was no health information available before the survey, we selected a priori determinants that can influence the spatial homogeneity of the health characteristics. This method yields a distribution of variables in the sample that closely resembles that in the overall population, something that cannot be guaranteed with randomly-selected clusters, especially if the number of selected clusters is small. In this way, we were able to survey specific areas while minimising design effects and maximising statistical precision. We applied this strategy in a health survey carried out in Vientiane, Lao People's Democratic Republic. We selected well-known health determinants with unequal spatial distribution within the city: nationality and literacy. We deliberately selected a combination of clusters whose distribution of nationality and literacy is similar to the distribution in the general population. This paper describes the conceptual reasoning behind the construction of the survey sample and shows that it can be advantageous to choose clusters using reasoned hypotheses, based on both probability and geographical approaches, in contrast to a conventional, random cluster selection strategy.
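The deliberate first-stage cluster choice can be illustrated as a small search for the cluster combination whose pooled determinant distribution best matches the population. This is a toy sketch under assumed data: the cluster names, sizes, literacy rates, and the single-determinant simplification are all hypothetical, not values from the Vientiane survey.

```python
from itertools import combinations

def pick_clusters(clusters, population, k):
    """Choose the k clusters whose pooled value of a determinant (here a
    literacy rate) is closest to the population value.
    `clusters` maps name -> (size, rate); `population` is the target rate."""
    best, best_gap = None, float("inf")
    for combo in combinations(clusters, k):
        size = sum(clusters[c][0] for c in combo)
        rate = sum(clusters[c][0] * clusters[c][1] for c in combo) / size
        gap = abs(rate - population)
        if gap < best_gap:
            best, best_gap = combo, gap
    return best

# Hypothetical clusters: (number of households, literacy rate)
clusters = {"A": (100, 0.90), "B": (80, 0.60), "C": (120, 0.75), "D": (60, 0.50)}
print(pick_clusters(clusters, population=0.70, k=2))  # -> ('B', 'C')
```

With several determinants, the gap would be a distance over the joint distribution rather than a single rate, but the exhaustive comparison of candidate combinations is the same.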

  16. Identification of Genetic Loci Underlying the Phenotypic Constructs of Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Liu, Xiao-Qing; Georgiades, Stelios; Duku, Eric; Thompson, Ann; Devlin, Bernie; Cook, Edwin H.; Wijsman, Ellen M.; Paterson, Andrew D.; Szatmari, Peter

    2011-01-01

    Objective: To investigate the underlying phenotypic constructs in autism spectrum disorders (ASD) and to identify genetic loci that are linked to these empirically derived factors. Method: Exploratory factor analysis was applied to two datasets with 28 selected Autism Diagnostic Interview-Revised (ADI-R) algorithm items. The first dataset was from…

  17. Analyzing Problem's Difficulty Based on Neural Networks and Knowledge Map

    ERIC Educational Resources Information Center

    Kuo, Rita; Lien, Wei-Peng; Chang, Maiga; Heh, Jia-Sheng

    2004-01-01

    This paper proposes a methodology to calculate both the difficulty of the basic problems and the difficulty of solving a problem. The method to calculate the difficulty of problem is according to the process of constructing a problem, including Concept Selection, Unknown Designation, and Proposition Construction. Some necessary measures observed…

  18. The Causal Meaning of Genomic Predictors and How It Affects Construction and Comparison of Genome-Enabled Selection Models

    PubMed Central

    Valente, Bruno D.; Morota, Gota; Peñagaricano, Francisco; Gianola, Daniel; Weigel, Kent; Rosa, Guilherme J. M.

    2015-01-01

    The term “effect” in additive genetic effect suggests a causal meaning. However, inferences of such quantities for selection purposes are typically viewed and conducted as a prediction task. Predictive ability as tested by cross-validation is currently the most acceptable criterion for comparing models and evaluating new methodologies. Nevertheless, it does not directly indicate if predictors reflect causal effects. Such evaluations would require causal inference methods that are not typical in genomic prediction for selection. This suggests that the usual approach to infer genetic effects contradicts the label of the quantity inferred. Here we investigate if genomic predictors for selection should be treated as standard predictors or if they must reflect a causal effect to be useful, requiring causal inference methods. Conducting the analysis as a prediction or as a causal inference task affects, for example, how covariates of the regression model are chosen, which may heavily affect the magnitude of genomic predictors and therefore selection decisions. We demonstrate that selection requires learning causal genetic effects. However, genomic predictors from some models might capture noncausal signal, providing good predictive ability but poorly representing true genetic effects. Simulated examples are used to show that aiming for predictive ability may lead to poor modeling decisions, while causal inference approaches may guide the construction of regression models that better infer the target genetic effect even when they underperform in cross-validation tests. In conclusion, genomic selection models should be constructed to aim primarily for identifiability of causal genetic effects, not for predictive ability. PMID:25908318

  19. A sampling and classification item selection approach with content balancing.

    PubMed

    Chen, Pei-Hua

    2015-03-01

Existing automated test assembly methods typically employ constrained combinatorial optimization. Constructing forms sequentially based on an optimization approach usually results in nonparallel forms and requires heuristic modifications. Methods based on a random search approach have the major advantage of producing parallel forms sequentially without further adjustment. This study incorporated a flexible content-balancing element into the statistical perspective item selection method of the cell-only method (Chen et al. in Educational and Psychological Measurement, 72(6), 933-953, 2012). The new method was compared with a sequential interitem distance weighted deviation model (IID WDM) (Swanson & Stocking in Applied Psychological Measurement, 17(2), 151-166, 1993), a simultaneous IID WDM, and a big-shadow-test mixed integer programming (BST MIP) method to construct multiple parallel forms based on matching a reference form item-by-item. The results showed that the cell-only method with content balancing and the sequential and simultaneous versions of IID WDM yielded results comparable to those obtained using the BST MIP method. The cell-only method with content balancing is computationally less intensive than the sequential and simultaneous versions of IID WDM.

  20. Development of Elderly Quality of Life Index – Eqoli: Item Reduction and Distribution into Dimensions

    PubMed Central

    Paschoal, Sérgio Márcio Pacheco; Filho, Wilson Jacob; Litvoc, Júlio

    2008-01-01

    OBJECTIVE To describe item reduction and its distribution into dimensions in the construction process of a quality of life evaluation instrument for the elderly. METHODS The sampling method was chosen by convenience through quotas, with selection of elderly subjects from four programs to achieve heterogeneity in the “health status”, “functional capacity”, “gender”, and “age” variables. The Clinical Impact Method was used, consisting of the spontaneous and elicited selection by the respondents of relevant items to the construct Quality of Life in Old Age from a previously elaborated item pool. The respondents rated each item’s importance using a 5-point Likert scale. The product of the proportion of elderly selecting the item as relevant (frequency) and the mean importance score they attributed to it (importance) represented the overall impact of that item in their quality of life (impact). The items were ordered according to their impact scores and the top 46 scoring items were grouped in dimensions by three experts. A review of the negative items was performed. RESULTS One hundred and ninety three people (122 women and 71 men) were interviewed. Experts distributed the 46 items into eight dimensions. Closely related items were grouped and dimensions not reaching the minimum expected number of items received additional items resulting in eight dimensions and 43 items. DISCUSSION The sample was heterogeneous and similar to what was expected. The dimensions and items demonstrated the multidimensionality of the construct. The Clinical Impact Method was appropriate to construct the instrument, which was named Elderly Quality of Life Index - EQoLI. An accuracy process will be examined in the future. PMID:18438571
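The impact score at the heart of the Clinical Impact Method (the proportion of respondents selecting an item times its mean importance rating) is simple to compute. This is a minimal sketch with invented numbers; the actual study interviewed 193 elderly respondents and used a 5-point Likert scale.

```python
def item_impact(selections, ratings):
    """Impact of one item: (proportion of respondents selecting it as
    relevant) x (mean 1-5 importance rating among those who rated it)."""
    frequency = sum(selections) / len(selections)   # 1 = selected, 0 = not
    importance = sum(ratings) / len(ratings)
    return frequency * importance

# Hypothetical item: 150 of 193 respondents selected it; a toy subset of
# their importance ratings is shown.
selected = [1] * 150 + [0] * 43
ratings = [4, 5, 4, 4]
print(round(item_impact(selected, ratings), 3))  # -> 3.303
```

Ranking all items by this score and keeping the top 46 would reproduce the reduction step described above.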

  1. Method of making a composite tube to metal joint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leslie, James C.; Leslie, II, James C.; Heard, James

A method for making a metal to composite tube joint including selecting an elongated interior fitting constructed with an exterior barrel, reduced in exterior diameter to form a distally facing annular shoulder and then projecting still further distally to form an interior sleeve having a radially outwardly facing bonding surface. Selecting an elongated metal outer sleeve formed proximally with a collar constructed for receipt over the barrel and increased in interior diameter and projecting distally to form an exterior sleeve having a radially inwardly facing bonding surface cooperating with the first bonding surface to form an annulus receiving an extremity of a composite tube and a bond bonding the extremity of the tube to the bonding surfaces.

  2. Finding minimum gene subsets with heuristic breadth-first search algorithm for robust tumor classification

    PubMed Central

    2012-01-01

    Background Previous studies on tumor classification based on gene expression profiles suggest that gene selection plays a key role in improving the classification performance. Moreover, finding important tumor-related genes with the highest accuracy is a very important task because these genes might serve as tumor biomarkers, which is of great benefit to not only tumor molecular diagnosis but also drug development. Results This paper proposes a novel gene selection method with rich biomedical meaning based on Heuristic Breadth-first Search Algorithm (HBSA) to find as many optimal gene subsets as possible. Due to the curse of dimensionality, this type of method could suffer from over-fitting and selection bias problems. To address these potential problems, a HBSA-based ensemble classifier is constructed using majority voting strategy from individual classifiers constructed by the selected gene subsets, and a novel HBSA-based gene ranking method is designed to find important tumor-related genes by measuring the significance of genes using their occurrence frequencies in the selected gene subsets. The experimental results on nine tumor datasets including three pairs of cross-platform datasets indicate that the proposed method can not only obtain better generalization performance but also find many important tumor-related genes. Conclusions It is found that the frequencies of the selected genes follow a power-law distribution, indicating that only a few top-ranked genes can be used as potential diagnosis biomarkers. Moreover, the top-ranked genes leading to very high prediction accuracy are closely related to specific tumor subtype and even hub genes. Compared with other related methods, the proposed method can achieve higher prediction accuracy with fewer genes. Moreover, they are further justified by analyzing the top-ranked genes in the context of individual gene function, biological pathway, and protein-protein interaction network. PMID:22830977
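The gene-ranking idea above, scoring each gene by its occurrence frequency across the selected optimal subsets, can be sketched as follows. The gene names and subsets are invented for illustration and are not from the paper's datasets.

```python
from collections import Counter

def rank_genes(subsets):
    """Rank genes by how often they occur across the selected optimal gene
    subsets; frequently occurring genes are candidate biomarkers."""
    counts = Counter(g for subset in subsets for g in subset)
    total = len(subsets)
    return sorted(((g, c / total) for g, c in counts.items()),
                  key=lambda x: -x[1])

# Hypothetical optimal subsets found by a heuristic breadth-first search
subsets = [{"TP53", "BRCA1"}, {"TP53", "EGFR"}, {"TP53", "BRCA1", "KRAS"}]
print(rank_genes(subsets))
```

A power-law shape in these frequencies, as the paper reports, would mean only the few top-ranked genes carry most of the discriminative signal.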

  3. Classification of early-stage non-small cell lung cancer by weighing gene expression profiles with connectivity information.

    PubMed

    Zhang, Ao; Tian, Suyan

    2018-05-01

Pathway-based feature selection algorithms, which utilize the biological information contained in pathways to guide which features/genes should be selected, have evolved quickly and become widespread in the field of bioinformatics. Based on how the pathway information is incorporated, we classify pathway-based feature selection algorithms into three major categories: penalty, stepwise forward, and weighting. Compared to the first two categories, the weighting methods have been underutilized even though they are usually the simplest ones. In this article, we constructed three different connectivity-information-based weights for each gene and then conducted feature selection upon the resulting weighted gene expression profiles. Using both simulations and a real-world application, we have demonstrated that when data-driven connectivity information constructed from the data of the specific disease under study is considered, the resulting weighted gene expression profiles slightly outperform the original expression profiles. In summary, a big challenge faced by the weighting method is how to estimate pathway knowledge-based weights more accurately and precisely; only when this issue is successfully resolved will wide utilization of the weighting methods be possible. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
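The weighting approach itself is the simplest of the three categories: scale each gene's expression values by a connectivity-derived weight before running any feature selection. The sketch below uses hypothetical weights (e.g., from normalized node degree in a pathway graph) and a toy expression matrix; it is not the authors' exact weighting scheme.

```python
def weight_profiles(expression, weights):
    """Scale each gene's expression column by its connectivity-based weight.
    `expression` is a samples x genes matrix; `weights` has one entry per gene."""
    return [[x * w for x, w in zip(row, weights)] for row in expression]

# Hypothetical 2 samples x 3 genes, with a well-connected hub gene up-weighted
expr = [[1.0, 2.0, 0.5],
        [0.8, 1.5, 0.7]]
weights = [2.0, 1.0, 0.5]   # e.g., normalized node degree in the pathway graph
print(weight_profiles(expr, weights))
```

Any downstream filter or wrapper selector then operates on the weighted matrix instead of the raw one, which is what lets pathway knowledge influence the selection without changing the selector itself.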

  4. A simple and reliable multi-gene transformation method for switchgrass.

    PubMed

    Ogawa, Yoichi; Shirakawa, Makoto; Koumoto, Yasuko; Honda, Masaho; Asami, Yuki; Kondo, Yasuhiro; Hara-Nishimura, Ikuko

    2014-07-01

A simple and reliable Agrobacterium-mediated transformation method was developed for switchgrass. Using this method, many transgenic plants carrying multiple genes of interest could be produced without untransformed escapes. Switchgrass (Panicum virgatum L.) is a promising biomass crop for bioenergy. To obtain transgenic switchgrass plants carrying a multi-gene trait in a simple manner, an Agrobacterium-mediated transformation method was established by constructing a Gateway-based binary vector, optimizing transformation conditions and developing a novel selection method. A MultiRound Gateway-compatible destination binary vector carrying the bar selectable marker gene, pHKGB110, was constructed to introduce multiple genes of interest in a single transformation. Two reporter gene expression cassettes, GUSPlus and gfp, were constructed independently on two entry vectors and then introduced into a single T-DNA region of pHKGB110 via sequential LR reactions. Agrobacterium tumefaciens EHA101 carrying the resultant binary vector pHKGB112 and caryopsis-derived compact embryogenic calli were used for transformation experiments. Prolonged cocultivation for 7 days followed by cultivation on media containing meropenem improved transformation efficiency without overgrowth of Agrobacterium, which was, however, not inhibited by cefotaxime or Timentin. In addition, untransformed escape shoots were completely eliminated during the rooting stage by directly dipping the putatively transformed shoots into the herbicide Basta solution for a few seconds, designated as the 'herbicide dipping method'. It was also demonstrated that more than 90% of the bar-positive transformants carried both reporters delivered from pHKGB112. This simple and reliable transformation method, which incorporates a new selection technique and the use of a MultiRound Gateway-based binary vector, would be suitable for producing a large number of transgenic lines carrying multiple genes.

  5. The Performance of IRT Model Selection Methods with Mixed-Format Tests

    ERIC Educational Resources Information Center

    Whittaker, Tiffany A.; Chang, Wanchen; Dodd, Barbara G.

    2012-01-01

    When tests consist of multiple-choice and constructed-response items, researchers are confronted with the question of which item response theory (IRT) model combination will appropriately represent the data collected from these mixed-format tests. This simulation study examined the performance of six model selection criteria, including the…

  6. Index Fund Selections with GAs and Classifications Based on Turnover

    NASA Astrophysics Data System (ADS)

    Orito, Yukiko; Motoyama, Takaaki; Yamazaki, Genji

It is well known that index fund selection is important for hedging the risk of investment in a stock market. "Selection" here means that, for stock index futures, n companies are selected from all those in the market. For index fund selection, Orito et al. (6) proposed a method consisting of the following two steps: Step 1 is to select N companies in the market with a heuristic rule based on the coefficient of determination between the return rate of each company in the market and the increasing rate of the stock price index. Step 2 is to construct a group of n companies by applying genetic algorithms to the set of N companies. We note that the rule of Step 1 is not unique, and the accuracy of the results using their method depends on the length of the time data (price data) in the experiments. The main purpose of this paper is to introduce a more effective rule for Step 1, based on turnover. The method consisting of Step 1 based on turnover and Step 2 is examined with numerical experiments on the 1st Section of the Tokyo Stock Exchange. The results show that with our method it is possible to construct a more effective index fund than with the method of Orito et al. (6). The accuracy of the results using our method depends little on the length of the time data (turnover data). The method works especially well when the increasing rate of the stock price index over a period can be viewed as linear time series data.
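Step 1 of the original rule, preselecting the N companies whose return series best tracks the index by coefficient of determination, can be sketched as below. The return series are invented, and the turnover-based variant would swap turnover data in for the price-based series.

```python
def r_squared(x, y):
    """Coefficient of determination between two equal-length series
    (squared Pearson correlation)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov * cov / (vx * vy)

def preselect(returns, index_rates, N):
    """Step 1: keep the N companies whose return series best tracks the
    increasing rate of the stock price index."""
    scored = sorted(returns, key=lambda c: -r_squared(returns[c], index_rates))
    return scored[:N]

# Hypothetical daily increasing rates of the index and company return rates
index_rates = [0.01, -0.02, 0.015, 0.005]
returns = {"X": [0.012, -0.018, 0.013, 0.004],   # tracks the index closely
           "Y": [0.03, 0.02, -0.01, 0.00],
           "Z": [0.01, -0.02, 0.016, 0.006]}
print(preselect(returns, index_rates, N=2))
```

Step 2 would then run a genetic algorithm over subsets of the preselected N companies to pick the final n-company fund.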

  7. Spectrum-based method to generate good decoy libraries for spectral library searching in peptide identifications.

    PubMed

    Cheng, Chia-Ying; Tsai, Chia-Feng; Chen, Yu-Ju; Sung, Ting-Yi; Hsu, Wen-Lian

    2013-05-03

As spectral library searching has received increasing attention for peptide identification, constructing good decoy spectra from the target spectra is the key to correctly estimating the false discovery rate when searching against the concatenated target-decoy spectral library. Several methods have been proposed to construct decoy spectral libraries. Most of them construct decoy peptide sequences and then generate theoretical spectra accordingly. In this paper, we propose a method, called precursor-swap, which constructs decoy spectral libraries directly at the "spectrum level", without generating decoy peptide sequences, by swapping the precursors of two spectra selected according to a very simple rule. Our spectrum-based method does not require additional effort to deal with ion types (e.g., a, b or c ions), fragmentation mechanisms (e.g., CID or ETD), or unannotated peaks, but preserves many spectral properties. The precursor-swap method is evaluated on different spectral libraries, and the resulting decoy ratios show that it is comparable to other methods. Notably, it is efficient in time and memory usage for constructing decoy libraries. A software tool called Precursor-Swap-Decoy-Generation (PSDG) is publicly available for download at http://ms.iis.sinica.edu.tw/PSDG/.
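The core precursor-swap idea, exchanging precursor m/z values between paired spectra while leaving fragment peaks untouched, can be sketched as follows. The pairing rule used here (adjacent entries after sorting by precursor m/z) is only a stand-in for the paper's unspecified "very simple rule", and the spectrum records are hypothetical.

```python
def precursor_swap(library):
    """Build decoy spectra by exchanging the precursor m/z of paired library
    entries; the fragment peak lists are kept untouched, which preserves
    spectral properties such as peak count and intensity distribution."""
    entries = sorted(library, key=lambda s: s["precursor_mz"])
    decoys = []
    for a, b in zip(entries[::2], entries[1::2]):
        decoys.append({"precursor_mz": b["precursor_mz"], "peaks": a["peaks"]})
        decoys.append({"precursor_mz": a["precursor_mz"], "peaks": b["peaks"]})
    return decoys

# Toy library entries: precursor m/z plus (fragment m/z, intensity) peaks
lib = [{"precursor_mz": 512.3, "peaks": [(200.1, 30.0), (300.2, 55.0)]},
       {"precursor_mz": 433.7, "peaks": [(150.0, 12.0), (250.5, 80.0)]}]
print(precursor_swap(lib))
```

Because only the precursor changes, a decoy never matches its own target at the same precursor mass, which is what makes the target-decoy FDR estimate meaningful.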

  8. Data Dependent Peak Model Based Spectrum Deconvolution for Analysis of High Resolution LC-MS Data

    PubMed Central

    2015-01-01

A data dependent peak model (DDPM) based spectrum deconvolution method was developed for analysis of high resolution LC-MS data. To construct the selected ion chromatogram (XIC), a clustering method, the density based spatial clustering of applications with noise (DBSCAN), is applied to all m/z values of an LC-MS data set to group the m/z values into each XIC. The DBSCAN constructs XICs without the need for a user defined m/z variation window. After the XIC construction, the peaks of molecular ions in each XIC are detected using both the first and the second derivative tests, followed by an optimized chromatographic peak model selection method for peak deconvolution. A total of six chromatographic peak models are considered, including Gaussian, log-normal, Poisson, gamma, exponentially modified Gaussian, and hybrid of exponential and Gaussian models. The abundant nonoverlapping peaks are chosen to find the optimal peak models that are both data- and retention-time-dependent. Analysis of 18 spiked-in LC-MS data demonstrates that the proposed DDPM spectrum deconvolution method outperforms the traditional method. On average, the DDPM approach not only detected 58 more chromatographic peaks from each of the testing LC-MS data but also improved the retention time and peak area by 3% and 6%, respectively. PMID:24533635
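The XIC-grouping step can be illustrated with a one-dimensional DBSCAN-style pass over sorted m/z values: values closer than a density threshold join the same cluster, and sparse clusters are treated as noise. This is a simplified stand-in for the full DBSCAN the paper applies (DBSCAN's own eps/min_pts parameters replace a user-defined m/z window); the values below are illustrative.

```python
def group_mz(mz_values, eps=0.01, min_pts=3):
    """DBSCAN-style grouping of m/z values into XIC clusters in one dimension:
    sorted values separated by gaps <= eps belong to the same cluster;
    clusters smaller than min_pts are dropped as noise."""
    clusters, current = [], []
    for mz in sorted(mz_values):
        if current and mz - current[-1] > eps:
            if len(current) >= min_pts:
                clusters.append(current)
            current = []
        current.append(mz)
    if len(current) >= min_pts:
        clusters.append(current)
    return clusters

mz = [100.001, 100.002, 100.003, 250.500, 250.501, 250.502, 399.9]
print(group_mz(mz))  # two clusters; the lone 399.9 value is dropped as noise
```

Each returned cluster corresponds to one XIC, whose intensity trace would then be passed to the derivative-test peak detection and model-selection stages.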

  9. Group Selection and Learning for a Lab-Based Construction Management Course

    ERIC Educational Resources Information Center

    Solanki, Pranshoo; Kothari, Nidhi

    2014-01-01

    In construction industries' projects, working in groups is a normal practice. Group work in a classroom is defined as students working collaboratively in a group so that everyone can participate on a collective task. The results from literature review indicate that group work is more effective method of learning as compared to individual work.…

  10. Robotic Construction Kits as Computational Manipulatives for Learning in the STEM Disciplines

    ERIC Educational Resources Information Center

    Sullivan, Florence R.; Heffernan, John

    2016-01-01

    This article presents a systematic review of research related to the use of robotics construction kits (RCKs) in P-12 learning in the STEM disciplines for typically developing children. The purpose of this review is to configure primarily qualitative and mixed methods findings from studies meeting our selection and quality criterion to answer the…

  11. Multi-Hierarchical Gray Correlation Analysis Applied in the Selection of Green Building Design Scheme

    NASA Astrophysics Data System (ADS)

    Wang, Li; Li, Chuanghong

    2018-02-01

As a sustainable form of ecological structure, green building is increasingly advocated and is now a widespread concern in society. Carrying out the evaluation and selection of green building design schemes in the survey and design phase of a construction project, in accordance with a scientific and reasonable evaluation index system, can effectively improve the ecological benefits of green building projects. Based on the new Green Building Evaluation Standard, which came into effect on January 1, 2015, an evaluation index system for green building design schemes is constructed, taking into account the evaluation contents related to the design scheme. Experts experienced in construction scheme optimization were organized to score the indices and determine the weight of each evaluation index through the AHP method. The correlation degree between each candidate scheme and the ideal scheme was calculated using a multilevel gray relational analysis model, and the optimal scheme was then determined. The feasibility and practicability of the evaluation method are verified with examples.
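The gray relational analysis step can be sketched as follows: for each scheme, compute the gray relational coefficient against the ideal scheme on every index, then aggregate the coefficients with the AHP weights; the scheme with the highest degree wins. The scores, weights, and the conventional resolution coefficient rho = 0.5 below are illustrative, not from the paper's examples.

```python
def gray_relational_degrees(schemes, ideal, weights, rho=0.5):
    """Gray relational degree of each candidate scheme against the ideal
    scheme. Per index k: xi(k) = (d_min + rho*d_max) / (d(k) + rho*d_max),
    where d(k) is the absolute deviation from the ideal; degrees are the
    AHP-weighted sums of the coefficients."""
    deltas = [[abs(s - i) for s, i in zip(scheme, ideal)] for scheme in schemes]
    flat = [d for row in deltas for d in row]
    d_min, d_max = min(flat), max(flat)
    degrees = []
    for row in deltas:
        xi = [(d_min + rho * d_max) / (d + rho * d_max) for d in row]
        degrees.append(sum(w * x for w, x in zip(weights, xi)))
    return degrees

# Hypothetical normalized scores on 3 indices for 2 design schemes
ideal = [1.0, 1.0, 1.0]
schemes = [[0.9, 0.8, 1.0], [0.7, 1.0, 0.6]]
weights = [0.5, 0.3, 0.2]   # AHP weights summing to 1
print(gray_relational_degrees(schemes, ideal, weights))
```

The first scheme scores higher here because it deviates less from the ideal on the heavily weighted first index; in practice the ideal scheme is the index-wise best value over all candidates.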

  12. Research and Development Project Selection Methods at the Air Force Wright Aeronautical Laboratories.

    DTIC Science & Technology

    1985-09-01

personal and telephone interviews. Ten individuals from each of the four AFWAL Laboratories were interviewed. The results illustrated that few of the...680). Aaker and Tyebjee, 1978. The authors constructed a model that dealt with the selection of interdependent R&D projects. The model covers three...of this research effort. Scope: The data collection method used in this study consisted of a combination of personal and telephone interviews. The

  13. Orpheus recombination: a comprehensive bacteriophage system for murine targeting vector construction by transplacement.

    PubMed

    Woltjen, Knut; Ito, Kenichi; Tsuzuki, Teruhisa; Rancourt, Derrick E

    2008-01-01

    In recent years, methods to address the simplification of targeting vector (TV) construction have been developed and validated. Based on in vivo recombination in Escherichia coli, these protocols have reduced dependence on restriction endonucleases, allowing the fabrication of complex TV constructs with relative ease. Using a methodology based on phage-plasmid recombination, we have developed a comprehensive TV construction protocol dubbed Orpheus recombination (ORE). The ORE system addresses all necessary requirements for TV construction; from the isolation of gene-specific regions of homology to the deposition of selection/disruption cassettes. ORE makes use of a small recombination plasmid, which bears positive and negative selection markers and a cloned homologous "probe" region. This probe plasmid may be introduced into and excised from phage-borne murine genomic clones by two rounds of single crossover recombination. In this way, desired clones can be specifically isolated from a heterogeneous library of phage. Furthermore, if the probe region contains a designed mutation, it may be deposited seamlessly into the genomic clone. The complete removal of operational sequences allows unlimited repetition of the procedure to customize and finalize TVs within a few weeks. Successful gene-specific clone isolation, point mutations, large deletions, cassette insertions, and finally coincident clone isolation and mutagenesis have all been demonstrated with this method.

  14. Development of Advanced Nuclide Separation and Recovery Methods using Ion-Exchange Techniques in Nuclear Backend

    NASA Astrophysics Data System (ADS)

    Miura, Hitoshi

    The development of compact separation and recovery methods using selective ion-exchange techniques is very important for reprocessing and for high-level liquid waste (HLLW) treatment in the nuclear backend field. Selective nuclide separation techniques are effective for the volume reduction of wastes and the utilization of valuable nuclides, and are expected to support the construction of an advanced nuclear fuel cycle system and the rationalization of waste treatment. To accomplish selective nuclide separation, the design and synthesis of novel adsorbents are essential for the development of compact and precise separation processes. The present paper deals with the preparation of highly functional and selective hybrid microcapsules enclosing nano-adsorbents in alginate gel polymer matrices by sol-gel methods, their characterization, and the clarification of their selective adsorption properties by batch and column methods. The selective separation of Cs, Pd and Re in real HLLW was further accomplished by using the novel microcapsules, and an advanced nuclide separation system is proposed based on the combination of selective processes using microcapsules.

  15. Familiarity effects in the construction of facial-composite images using modern software systems.

    PubMed

    Frowd, Charlie D; Skelton, Faye C; Butt, Neelam; Hassan, Amal; Fields, Stephen; Hancock, Peter J B

    2011-12-01

    We investigate the effect of target familiarity on the construction of facial composites, as used by law enforcement to locate criminal suspects. Two popular software construction methods were investigated. Participants were shown a target face that was either familiar or unfamiliar to them and constructed a composite of it from memory using a typical 'feature' system, involving selection of individual facial features, or one of the newer 'holistic' types, involving repeated selection and breeding from arrays of whole faces. This study found that composites constructed of a familiar face were named more successfully than composites of an unfamiliar face; also, naming of composites of internal and external features was equivalent for construction of unfamiliar targets, but internal features were better named than the external features for familiar targets. These findings applied to both systems, although a benefit emerged for the holistic type due to more accurate construction of internal features and evidence of a whole-face advantage. STATEMENT OF RELEVANCE: This work is of relevance to practitioners who construct facial composites with witnesses to and victims of crime, as well as for software designers to help them improve the effectiveness of their composite systems.

  16. How to Select a Project Delivery Method for School Facilities

    ERIC Educational Resources Information Center

    Kalina, David

    2007-01-01

    In this article, the author discusses and explains three project delivery methods that are commonly used today in the United States. The first project delivery method mentioned is the design-bid-build, which is still the predominant method of project delivery for public works and school construction in the United States. The second is the…

  17. An application of the Pareto method in surveys to diagnose managers' and workers' perception of occupational safety and health on selected Polish construction sites.

    PubMed

    Obolewicz, Jerzy; Dąbrowski, Andrzej

    2017-11-16

    The construction industry is an important sector of the economy in Poland. According to National Labour Inspectorate (PIP) data for 2014, the number of victims of fatal accidents in the construction sector amounted to 80, compared with 187 in all other sectors of the Polish economy. This article presents the results of surveys on the impact of construction worker behaviour on occupational safety and health outcomes. The surveys took into account the points of view of both construction site management (the tactical level) and construction workers (the operational level). For the analysis of results, the method of numerical taxonomy and Pareto charts was employed, which allowed the authors to identify the areas of occupational safety and health, at both the operational and the tactical level, in which improvement actions need to be proposed for workers employed in micro, small, medium and large construction enterprises.
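    A Pareto analysis of survey findings, as used in this study, ranks problem categories by frequency and isolates the "vital few" that account for most occurrences. The hazard categories and counts below are hypothetical, for illustration only.

```python
from collections import Counter

def pareto_analysis(observations, threshold=0.8):
    """Rank problem categories by frequency and return the 'vital few'
    that together account for `threshold` of all occurrences."""
    counts = Counter(observations).most_common()    # sorted descending
    total = sum(c for _, c in counts)
    vital, cum = [], 0.0
    for category, c in counts:
        cum += c / total
        vital.append((category, c, round(cum, 3)))  # cumulative share
        if cum >= threshold:
            break
    return vital

# hypothetical OSH survey observations from a construction site
data = (["no helmet"] * 8 + ["unsecured scaffold"] * 6 +
        ["missing guardrail"] * 3 + ["poor lighting"] * 2 + ["other"] * 1)
vital = pareto_analysis(data)
```

    On this made-up sample, the first three categories already cover 85% of all recorded problems, so improvement actions would target them first.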

  18. Fast online generalized multiscale finite element method using constraint energy minimization

    NASA Astrophysics Data System (ADS)

    Chung, Eric T.; Efendiev, Yalchin; Leung, Wing Tat

    2018-02-01

    Local multiscale methods often construct multiscale basis functions in the offline stage without taking into account input parameters, such as source terms, boundary conditions, and so on. These basis functions are then used in the online stage with a specific input parameter to solve the global problem at a reduced computational cost. Recently, online approaches have been introduced, where multiscale basis functions are adaptively constructed in some regions to reduce the error significantly. In multiscale methods, it is desirable to need only 1-2 iterations to reduce the error to a desired threshold. Using the Generalized Multiscale Finite Element Framework [10], it was shown that by choosing a sufficient number of offline basis functions, the error reduction can be made independent of physical parameters, such as scales and contrast. In this paper, our goal is to improve on this. Using our recently proposed approach [4] and a special online basis construction in oversampled regions, we show that the error reduction can be made sufficiently large by appropriately selecting the oversampling regions. Our numerical results show that one can achieve a three-order-of-magnitude error reduction, which is better than our previous methods. We also develop an adaptive algorithm that enriches in selected regions with large residuals. In our adaptive method, we show that the convergence rate can be determined by a user-defined parameter, and we confirm this by numerical simulations. The analysis of the method is presented.

  19. Quantifying construction and demolition waste: an analytical review.

    PubMed

    Wu, Zezhou; Yu, Ann T W; Shen, Liyin; Liu, Guiwen

    2014-09-01

    Quantifying construction and demolition (C&D) waste generation is regarded as a prerequisite for the implementation of successful waste management. In literature, various methods have been employed to quantify the C&D waste generation at both regional and project levels. However, an integrated review that systemically describes and analyses all the existing methods has yet to be conducted. To bridge this research gap, an analytical review is conducted. Fifty-seven papers are retrieved based on a set of rigorous procedures. The characteristics of the selected papers are classified according to the following criteria - waste generation activity, estimation level and quantification methodology. Six categories of existing C&D waste quantification methodologies are identified, including site visit method, waste generation rate method, lifetime analysis method, classification system accumulation method, variables modelling method and other particular methods. A critical comparison of the identified methods is given according to their characteristics and implementation constraints. Moreover, a decision tree is proposed for aiding the selection of the most appropriate quantification method in different scenarios. Based on the analytical review, limitations of previous studies and recommendations of potential future research directions are further suggested. Copyright © 2014 Elsevier Ltd. All rights reserved.
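    As a minimal illustration of one of the six identified categories, the waste generation rate method scales a per-area generation rate by the building's gross floor area, per waste stream. The rates below are made-up placeholders, not figures from the reviewed papers.

```python
def waste_by_generation_rate(gross_floor_area_m2, rates_kg_per_m2):
    """Waste generation rate method: estimated waste per stream equals
    gross floor area multiplied by a per-area generation rate."""
    return {stream: gross_floor_area_m2 * r
            for stream, r in rates_kg_per_m2.items()}

# illustrative per-m2 rates (kg/m2) for a 5000 m2 project
rates = {"concrete": 32.0, "brick": 11.5, "timber": 3.2}
estimate = waste_by_generation_rate(5000, rates)
total_t = sum(estimate.values()) / 1000  # convert kg to tonnes
```

    With these placeholder rates, the project-level estimate comes to roughly 233.5 tonnes; in practice the rates would come from regional surveys or site records.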

  20. B2B collaboration method through trust values for e-supply chain integrator: a case study of Malaysian construction industry

    NASA Astrophysics Data System (ADS)

    Ab. Aziz, Norshakirah; Ahmad, Rohiza; Dhanapal Durai, Dominic

    2011-12-01

    Limited trust, cooperation and communication have been identified as some of the issues that hinder collaboration among business partners. These issues also hold true for the acceptance of an e-supply chain integrator among organizations in the same industry. On top of that, the huge number of components in the supply chain industry makes it impossible to include the entire supply chain in the integrator. Hence, this study proposes a method for identifying "trusted" collaborators for inclusion in an e-supply chain integrator. For the purpose of constructing and validating the method, the Malaysian construction industry was chosen as the case study due to its size and importance to the economy. This paper puts forward the background of the research, relevant literature leading to the formulation of trust value elements, data collection from the Malaysian construction supply chain, and a glimpse of the proposed method for trusted partner selection. Future work highlighting the next steps of this research is also presented.

  1. Construction and applications of exon-trapping gene-targeting vectors with a novel strategy for negative selection.

    PubMed

    Saito, Shinta; Ura, Kiyoe; Kodama, Miho; Adachi, Noritaka

    2015-06-30

    Targeted gene modification by homologous recombination provides a powerful tool for studying gene function in cells and animals. In higher eukaryotes, non-homologous integration of targeting vectors occurs several orders of magnitude more frequently than does targeted integration, making the gene-targeting technology highly inefficient. For this reason, negative-selection strategies have been employed to reduce the number of drug-resistant clones associated with non-homologous vector integration, particularly when artificial nucleases to introduce a DNA break at the target site are unavailable or undesirable. As such, an exon-trap strategy using a promoterless drug-resistance marker gene provides an effective way to counterselect non-homologous integrants. However, constructing exon-trapping targeting vectors has been a time-consuming and complicated process. By virtue of highly efficient att-mediated recombination, we successfully developed a simple and rapid method to construct plasmid-based vectors that allow for exon-trapping gene targeting. These exon-trap vectors were useful in obtaining correctly targeted clones in mouse embryonic stem cells and human HT1080 cells. Most importantly, with the use of a conditionally cytotoxic gene, we further developed a novel strategy for negative selection, thereby enhancing the efficiency of counterselection for non-homologous integration of exon-trap vectors. Our methods will greatly facilitate exon-trapping gene-targeting technologies in mammalian cells, particularly when combined with the novel negative selection strategy.

  2. Construction of reactive potential energy surfaces with Gaussian process regression: active data selection

    NASA Astrophysics Data System (ADS)

    Guan, Yafu; Yang, Shuo; Zhang, Dong H.

    2018-04-01

    Gaussian process regression (GPR) is an efficient non-parametric method for constructing multi-dimensional potential energy surfaces (PESs) for polyatomic molecules. Since not only the posterior mean but also the posterior variance can be easily calculated, GPR provides a well-established model for active learning, through which PESs can be constructed more efficiently and accurately. We propose a strategy of active data selection for the construction of PESs with emphasis on low-energy regions. The validity of this strategy is verified on the three-dimensional (3D) example of H3. The PESs for two prototypically reactive systems, namely the H + H2O ↔ H2 + OH reaction and the H + CH4 ↔ H2 + CH3 reaction, are then reconstructed; only 920 and 4000 points, respectively, are needed. The accuracy of the GP PESs is not only tested by energy errors but also validated by quantum scattering calculations.
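    The active-selection loop described above can be sketched with a plain-NumPy Gaussian process: at each step, the geometry where the posterior variance is largest is added to the training set. The 1-D toy target, kernel length scale, and candidate grid are illustrative assumptions; actual PES construction is multi-dimensional and uses ab initio energies.

```python
import numpy as np

def rbf(a, b, ell=0.5):
    """Squared-exponential kernel matrix between two 1-D point sets."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

def gp_posterior(x_train, y_train, x_cand, noise=1e-6):
    """GP posterior mean and variance at candidate points (unit prior variance)."""
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf(x_train, x_cand)
    mean = Ks.T @ np.linalg.solve(K, y_train)
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks), axis=0)
    return mean, var

f = lambda x: np.sin(3 * x)          # stand-in for an expensive energy evaluation
x_train = np.array([0.0, 1.0])       # two seed geometries
y_train = f(x_train)
cand = np.linspace(0.0, 1.0, 101)    # candidate geometries
for _ in range(5):                   # active learning loop
    _, var = gp_posterior(x_train, y_train, cand)
    x_new = cand[np.argmax(var)]     # query where the model is least certain
    x_train = np.append(x_train, x_new)
    y_train = np.append(y_train, f(x_new))
mean, var = gp_posterior(x_train, y_train, cand)
```

    After only five actively chosen points the posterior variance collapses over the whole candidate grid, which is the mechanism that lets the paper's PESs be built from comparatively few energies.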

  3. Educational Research with Real-World Data: Reducing Selection Bias with Propensity Scores

    ERIC Educational Resources Information Center

    Adelson, Jill L.

    2013-01-01

    Often it is infeasible or unethical to use random assignment in educational settings to study important constructs and questions. Hence, educational research often uses observational data, such as large-scale secondary data sets and state and school district data, and quasi-experimental designs. One method of reducing selection bias in estimations…
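    A minimal sketch of the propensity-score idea, assuming a single covariate and a logistic propensity model fitted by gradient ascent; the data are synthetic, and the greedy 1:1 nearest-neighbour matching shown is only one of several common schemes.

```python
import numpy as np

def propensity_scores(X, treated, iters=3000, lr=0.1):
    """Estimate P(treated | covariates) with a logistic model fitted by
    gradient ascent on the log-likelihood."""
    Xb = np.column_stack([np.ones(len(X)), X])   # prepend an intercept column
    w = np.zeros(Xb.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w += lr * Xb.T @ (treated - p) / len(X)
    return 1.0 / (1.0 + np.exp(-Xb @ w))

def match_nearest(scores, treated):
    """Greedy 1:1 nearest-neighbour matching on the propensity score,
    without replacement."""
    controls = list(np.where(treated == 0)[0])
    pairs = []
    for i in np.where(treated == 1)[0]:
        j = min(controls, key=lambda c: abs(scores[i] - scores[c]))
        pairs.append((int(i), int(j)))
        controls.remove(j)                       # each control used once
    return pairs

# hypothetical covariate (e.g. prior achievement); uptake rises with it
X = np.array([[0.0], [1.0], [2.0], [3.0], [2.5], [3.5], [4.5], [5.5]])
treated = np.array([0, 0, 0, 0, 1, 1, 1, 1])
scores = propensity_scores(X, treated)
pairs = match_nearest(scores, treated)
```

    Comparing outcomes only across matched pairs reduces the selection bias that would arise from comparing raw treated and untreated groups.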

  4. Application of multi-criteria decision analysis in selecting of sustainable investments

    NASA Astrophysics Data System (ADS)

    Kozik, Renata

    2017-07-01

    Investors in construction projects, especially those financed with public money, are quite slow to adopt environmentally friendly solutions, e.g. passive buildings. Practice shows that the use of green public procurement among public investors is negligible. Energy-saving technologies and equipment are expensive at the construction phase, and investors take little or no account of future operating costs. The aim of this article is to apply the multi-criteria analysis method ELECTRE to select the best investment in terms of implementation cost, operating cost, and environmental impact.
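    ELECTRE-family methods rank alternatives through concordance and discordance indices rather than a single aggregate score. The sketch below implements the classic ELECTRE I outranking test; the criteria, weights, and thresholds are hypothetical, not taken from the article.

```python
import numpy as np

def electre_i(perf, weights, c_hat=0.6, d_hat=0.4):
    """ELECTRE I outranking: S[i, j] is True when alternative i outranks j,
    i.e. the weighted concordance is high and no single criterion vetoes
    via a large discordance. All criteria are scaled 'higher is better'."""
    n = len(perf)
    rng = perf.max(axis=0) - perf.min(axis=0)    # per-criterion score range
    C = np.zeros((n, n))
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            C[i, j] = weights[perf[i] >= perf[j]].sum()              # concordance
            D[i, j] = ((perf[j] - perf[i]) / rng).clip(min=0).max()  # discordance
    return (C >= c_hat) & (D <= d_hat)

# hypothetical criteria (rescaled so higher is better):
# implementation cost, operating cost, environmental impact
perf = np.array([[0.8, 0.4, 0.9],
                 [0.6, 0.9, 0.7],
                 [0.3, 0.5, 0.2]])
w = np.array([0.4, 0.35, 0.25])
S = electre_i(perf, w)
```

    In this toy instance, alternatives 0 and 1 both outrank alternative 2, while neither outranks the other: alternative 0's concordance over 1 is sufficient, but the large operating-cost gap triggers the discordance veto.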

  5. Effect of Items Direction (Positive or Negative) on the Factorial Construction and Criterion Related Validity in Likert Scale

    ERIC Educational Resources Information Center

    Naji Qasem, Mamun Ali; Ahmad Gul, Showkeen Bilal

    2014-01-01

    The study was conducted to know the effect of items direction (positive or negative) on the factorial construction and criterion related validity in Likert scale. The descriptive survey research method was used for the study and the sample consisted of 510 undergraduate students selected by used random sampling technique. A scale developed by…

  6. Quantifying construction and demolition waste: An analytical review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Zezhou; Yu, Ann T.W., E-mail: bsannyu@polyu.edu.hk; Shen, Liyin

    2014-09-15

    Highlights: • Prevailing C and D waste quantification methodologies are identified and compared. • One specific methodology cannot fulfill all waste quantification scenarios. • A relevance tree for appropriate quantification methodology selection is proposed. • More attention should be paid to civil and infrastructural works. • Classified information is suggested for making an effective waste management plan. - Abstract: Quantifying construction and demolition (C and D) waste generation is regarded as a prerequisite for the implementation of successful waste management. In literature, various methods have been employed to quantify the C and D waste generation at both regional and project levels. However, an integrated review that systemically describes and analyses all the existing methods has yet to be conducted. To bridge this research gap, an analytical review is conducted. Fifty-seven papers are retrieved based on a set of rigorous procedures. The characteristics of the selected papers are classified according to the following criteria - waste generation activity, estimation level and quantification methodology. Six categories of existing C and D waste quantification methodologies are identified, including site visit method, waste generation rate method, lifetime analysis method, classification system accumulation method, variables modelling method and other particular methods. A critical comparison of the identified methods is given according to their characteristics and implementation constraints. Moreover, a decision tree is proposed for aiding the selection of the most appropriate quantification method in different scenarios. Based on the analytical review, limitations of previous studies and recommendations of potential future research directions are further suggested.

  7. Engineering a growth sensor to select intracellular antibodies in the cytosol of mammalian cells.

    PubMed

    Nguyen, Thuy Duong; Takasuka, Hitoshi; Kaku, Yoshihiro; Inoue, Satoshi; Nagamune, Teruyuki; Kawahara, Masahiro

    2017-07-01

    Intracellular antibodies (intrabodies) are expected to function as therapeutics as well as tools for elucidating the in vivo function of proteins. In this study, we propose a novel intrabody selection method in the cytosol of mammalian cells that utilizes a growth signal induced by the interaction of the target antigen and an scFv-c-kit growth sensor. Here, we apply this method for the first time to select specific intrabodies against rabies virus nucleoprotein (RV-N). As a result, we successfully select antigen-specific intrabodies from a naïve synthetic library using phage panning followed by our growth sensor-based intracellular selection method, demonstrating its feasibility. Additionally, we succeed in improving the response of the growth sensor by re-engineering the linker region of its construct. Collectively, the described selection method utilizing a growth sensor may become a highly efficient platform for the selection of functional intrabodies in the future. Copyright © 2017 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.

  8. Investigation of using shrinking method in construction of Institute for Research in Fundamental Sciences Electron Linear Accelerator TW-tube (IPM TW-Linac tube)

    NASA Astrophysics Data System (ADS)

    Ghasemi, F.; Abbasi Davani, F.

    2015-06-01

    Due to Iran's growing need for accelerators in various applications, IPM's electron Linac project has been defined. This accelerator is a 15 MeV S-band traveling-wave accelerator, designed and constructed around the klystron that has been built in Iran. By design, the operating mode is π/2 and the accelerating chamber consists of two 60 cm long constant-impedance tubes and a 30 cm long buncher. Among all construction methods, the shrinking method was selected for the IPM electron Linac tube because of its simple procedure and because it requires no large vacuum or hydrogen furnaces. In this paper, different aspects of this method are investigated. According to the calculations, the linear ratio of frequency shift to radius change is 787.8 MHz/cm, and the maximum deformation at the tube wall where the disks and the tube make contact is 2.7 μm. Applying the shrinking method to the construction of 8- and 24-cavity tubes yields satisfactory frequency and quality factor. The average deviations of the cavity frequencies from the design values are 0.68 MHz and 1.8 MHz for the 8- and 24-cavity tubes, respectively, before tuning, and 0.2 MHz and 0.4 MHz after tuning. The accelerating tubes, buncher, and high-power couplers of IPM's electron linac were constructed using the shrinking method.

  9. Introduction of structural affinity handles as a tool in selective nucleic acid separations

    NASA Technical Reports Server (NTRS)

    Willson, III, Richard Coale (Inventor); Cano, Luis Antonio (Inventor)

    2011-01-01

    The method is used for separating nucleic acids and other similar constructs. It involves selective introduction, enhancement, or stabilization of affinity handles such as single-strandedness in the undesired (or desired) nucleic acids as compared to the usual structure (e.g., double-strandedness) of the desired (or undesired) nucleic acids. The undesired (or desired) nucleic acids are separated from the desired (or undesired) nucleic acids due to capture by methods including but not limited to immobilized metal affinity chromatography, immobilized single-stranded DNA binding (SSB) protein, and immobilized oligonucleotides. The invention is useful for removing contaminating genomic DNA from plasmid DNA; removing genomic DNA from plasmids, BACs, and similar constructs; selectively separating oligonucleotides and similar DNA fragments from their partner strands; purifying aptamers, (deoxy)ribozymes and other highly structured nucleic acids; separating restriction fragments without using agarose gels; manufacturing recombinant Taq polymerase or similar products that are sensitive to host genomic DNA contamination; and other applications.

  10. Basic materials and structures aspects for hypersonic transport vehicles (HTV)

    NASA Astrophysics Data System (ADS)

    Steinheil, E.; Uhse, W.

    A Mach 5 transport design is used to illustrate structural concepts and criteria for materials selection, as well as key technologies that must be pursued in the areas of computational methods, materials and construction methods. Aside from the primary criteria of low weight, low costs, and acceptable risks, a number of additional requirements must be met, including stiffness and strength, corrosion resistance, durability, and a construction adequate for inspection, maintenance and repair. Current aircraft construction requirements are significantly extended for hypersonic vehicles. Additional consideration is given to long-duration temperature resistance of the airframe structure, the integration of large-volume cryogenic fuel tanks, computational tools, structural design, polymer matrix composites, and advanced manufacturing technologies.

  11. Method for construction of bacterial strains with increased succinic acid production

    DOEpatents

    Donnelly, Mark I.; Sanville-Millard, Cynthia; Chatterjee, Ranjini

    2000-01-01

    A fermentation process for producing succinic acid is provided comprising selecting a bacterial strain that does not produce succinic acid in high yield, disrupting the normal regulation of sugar metabolism of said bacterial strain, and combining the mutant bacterial strain and selected sugar in anaerobic conditions to facilitate production of succinic acid. Also provided is a method for changing low yield succinic acid producing bacteria to high yield succinic acid producing bacteria comprising selecting a bacterial strain having a phosphotransferase system and altering the phosphotransferase system so as to allow the bacterial strain to simultaneously metabolize different sugars.

  12. Measuring cognition in teams: a cross-domain review.

    PubMed

    Wildman, Jessica L; Salas, Eduardo; Scott, Charles P R

    2014-08-01

    The purpose of this article is twofold: to provide a critical cross-domain evaluation of team cognition measurement options and to provide novice researchers with practical guidance when selecting a measurement method. A vast selection of measurement approaches exist for measuring team cognition constructs including team mental models, transactive memory systems, team situation awareness, strategic consensus, and cognitive processes. Empirical studies and theoretical articles were reviewed to identify all of the existing approaches for measuring team cognition. These approaches were evaluated based on theoretical perspective assumed, constructs studied, resources required, level of obtrusiveness, internal consistency reliability, and predictive validity. The evaluations suggest that all existing methods are viable options from the point of view of reliability and validity, and that there are potential opportunities for cross-domain use. For example, methods traditionally used only to measure mental models may be useful for examining transactive memory and situation awareness. The selection of team cognition measures requires researchers to answer several key questions regarding the theoretical nature of team cognition and the practical feasibility of each method. We provide novice researchers with guidance regarding how to begin the search for a team cognition measure and suggest several new ideas regarding future measurement research. We provide (1) a broad overview and evaluation of existing team cognition measurement methods, (2) suggestions for new uses of those methods across research domains, and (3) critical guidance for novice researchers looking to measure team cognition.

  13. The methodic of calculation for the need of basic construction machines on construction site when developing organizational and technological documentation

    NASA Astrophysics Data System (ADS)

    Zhadanovsky, Boris; Sinenko, Sergey

    2018-03-01

    Economic indicators of construction work, particularly in high-rise construction, are directly related to the choice of an optimal number of machines. A shortage of machinery makes it impossible to complete construction and installation work on schedule. The pace of construction and installation work and labor productivity in high-rise construction largely depend on the degree to which the project is provided with machines (the level of mechanization). When calculating the need for machines on construction projects, it is necessary to ensure that work is completed on schedule, that the level of complex mechanization is raised, that productivity is increased and manual work reduced, and that the usage and maintenance of the machine fleet are improved. The selection of machines and the determination of their numbers should be carried out using the formulas presented in this work.
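    The paper's own formulas are not reproduced in this abstract. As a hedged, generic illustration of the kind of calculation involved, a fleet-size estimate can divide the total work volume by one machine's output over the scheduled shifts, derated by a utilization factor; all figures below are invented.

```python
from math import ceil

def machines_needed(work_volume, shift_output, shifts_scheduled, utilization=0.85):
    """Generic fleet-size estimate (illustrative only, not the paper's formula):
    machines = work volume / (output per machine-shift x shifts x utilization),
    rounded up so the schedule can actually be met."""
    return ceil(work_volume / (shift_output * shifts_scheduled * utilization))

# e.g. 12 000 m3 of excavation, 120 m3 per machine-shift, 60 shifts available
n = machines_needed(12000, 120, 60)
```

    Rounding up is what guarantees the scheduled completion date; the utilization factor stands in for downtime, maintenance and relocation between work fronts.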

  14. Linear reduction method for predictive and informative tag SNP selection.

    PubMed

    He, Jingwu; Westbrooks, Kelly; Zelikovsky, Alexander

    2005-01-01

    Constructing a complete human haplotype map is helpful when associating complex diseases with their related SNPs. Unfortunately, the number of SNPs is very large and it is costly to sequence many individuals. Therefore, it is desirable to reduce the number of SNPs that must be sequenced to a small number of informative representatives called tag SNPs. In this paper, we propose a new linear algebra-based method for selecting and using tag SNPs. We measure the quality of our tag SNP selection algorithm by comparing actual SNPs with SNPs predicted from the selected linearly independent tag SNPs. Our experiments show that for sufficiently long haplotypes, knowing only 0.4% of all SNPs, the proposed linear reduction method predicts an unknown haplotype with an error rate below 2%, based on 10% of the population.
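    The idea of the linear reduction can be sketched as follows: tag SNPs are chosen as linearly independent columns of the haplotype matrix, and the remaining SNPs are predicted through a least-squares linear map fitted on the known haplotypes. The toy panel below is illustrative; the paper's actual selection and prediction procedures may differ in detail.

```python
import numpy as np

def select_tag_snps(H):
    """Greedily pick linearly independent SNP columns (the tag SNPs) from a
    haplotype matrix H (rows = haplotypes, columns = SNP sites)."""
    tags = []
    for j in range(H.shape[1]):
        if np.linalg.matrix_rank(H[:, tags + [j]]) > len(tags):
            tags.append(j)               # column j carries new information
    return tags

def predict_from_tags(H_train, tags, tag_values):
    """Predict every SNP of a new haplotype from its tag-SNP values via a
    least-squares linear map fitted on the training haplotypes."""
    coef, *_ = np.linalg.lstsq(H_train[:, tags], H_train, rcond=None)
    return np.rint(tag_values @ coef)    # round back to 0/1 alleles

# toy panel: SNP2 duplicates SNP1 and SNP3 duplicates SNP0
H = np.array([[0, 0, 0, 0],
              [1, 0, 0, 1],
              [0, 1, 1, 0],
              [1, 1, 1, 1]], dtype=float)
tags = select_tag_snps(H)                # only two columns are independent
full = predict_from_tags(H, tags, np.array([1.0, 0.0]))
```

    Here sequencing only the two tag SNPs of a new individual suffices to reconstruct all four sites, which is the cost saving the abstract describes at scale.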

  15. Phased Antenna Array for Global Navigation Satellite System Signals

    NASA Technical Reports Server (NTRS)

    Turbiner, Dmitry (Inventor)

    2015-01-01

    Systems and methods for phased array antennas are described. Supports for phased array antennas can be constructed by 3D printing. The array elements and combiner network can be constructed by conducting wire. Different parameters of the antenna, like the gain and directivity, can be controlled by selection of the appropriate design, and by electrical steering. Phased array antennas may be used for radio occultation measurements.
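    Electrical steering of the kind mentioned in this patent abstract follows from applying a per-element phase ramp. The sketch below computes the array factor of a uniform linear array; the element count, spacing, and steering angle are arbitrary examples, not values from the patent.

```python
import numpy as np

def array_factor(n_elements, spacing_wl, steer_deg, theta_deg):
    """Normalized array factor of a uniform linear phased array.

    spacing_wl: element spacing in wavelengths
    steer_deg:  steering angle set by per-element phase shifts
    theta_deg:  observation angle(s)
    """
    theta = np.radians(theta_deg)
    steer = np.radians(steer_deg)
    n = np.arange(n_elements)
    # per-element phase: k * d * n * (sin(theta) - sin(theta0))
    phase = 2 * np.pi * spacing_wl * n[:, None] * (
        np.sin(theta) - np.sin(steer))[None, :]
    return np.abs(np.exp(1j * phase).sum(axis=0)) / n_elements

angles = np.linspace(-90, 90, 361)
af = array_factor(8, 0.5, steer_deg=20, theta_deg=angles)
peak = angles[np.argmax(af)]     # main lobe lands at the steering angle
```

    Changing only `steer_deg` moves the main lobe without any mechanical motion, which is the "electrical steering" the abstract refers to; half-wavelength spacing keeps grating lobes out of visible space.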

  16. Assembly planning based on subassembly extraction

    NASA Technical Reports Server (NTRS)

    Lee, Sukhan; Shin, Yeong Gil

    1990-01-01

    A method is presented for the automatic determination of assembly partial orders from a liaison graph representation of an assembly through the extraction of preferred subassemblies. In particular, the authors show how to select a set of tentative subassemblies by decomposing a liaison graph into a set of subgraphs based on feasibility and difficulty of disassembly, how to evaluate each of the tentative subassemblies in terms of assembly cost using the subassembly selection indices, and how to construct a hierarchical partial order graph (HPOG) as an assembly plan. The method provides an approach to assembly planning by identifying spatial parallelism in assembly as a means of constructing temporal relationships among assembly operations and solves the problem of finding a cost-effective assembly plan in a flexible environment. A case study of the assembly planning of a mechanical assembly is presented.

  17. Tier One Performance Screen Initial Operational Test and Evaluation: 2012 Interim Report

    DTIC Science & Technology

    2013-12-01

    are known to predict outcomes in work settings. Because the TAPAS uses item response theory (IRT) methods to construct and score items, it can be...Qualification Test (AFQT), to select new Soldiers. Although the AFQT is useful for selecting new Soldiers, other personal attributes are important to...to be and will continue to serve as a useful metric for selecting new Soldiers, other personal attributes, in particular non-cognitive attributes

  18. Splitting nodes and linking channels: A method for assembling biocircuits from stochastic elementary units

    NASA Astrophysics Data System (ADS)

    Ferwerda, Cameron; Lipan, Ovidiu

    2016-11-01

    Akin to electric circuits, we construct biocircuits that are manipulated by cutting and assembling channels through which stochastic information flows. This diagrammatic manipulation allows us to create a method which constructs networks by joining building blocks selected so that (a) they cover only basic processes; (b) it is scalable to large networks; (c) the mean and variance-covariance from the Pauli master equation form a closed system; and (d) given the initial probability distribution, no special boundary conditions are necessary to solve the master equation. The method aims to help with both designing new synthetic signaling pathways and quantifying naturally existing regulatory networks.

  19. Intelligent fault diagnosis of rolling bearings using an improved deep recurrent neural network

    NASA Astrophysics Data System (ADS)

    Jiang, Hongkai; Li, Xingqiu; Shao, Haidong; Zhao, Ke

    2018-06-01

    Traditional intelligent fault diagnosis methods for rolling bearings heavily depend on manual feature extraction and feature selection. To overcome this limitation, an intelligent deep learning method, named the improved deep recurrent neural network (DRNN), is proposed in this paper. Firstly, frequency spectrum sequences are used as inputs to reduce the input size and ensure good robustness. Secondly, the DRNN is constructed by stacking recurrent hidden layers to automatically extract features from the input spectrum sequences. Thirdly, an adaptive learning rate is adopted to improve the training performance of the constructed DRNN. The proposed method is verified with experimental rolling bearing data, and the results confirm that it is more effective than traditional intelligent fault diagnosis methods.
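    The recurrent feature extraction described above can be illustrated with a single plain-NumPy recurrent layer run over a spectrum sequence; the layer sizes and random weights below are placeholders, and the paper's stacked layers and adaptive learning rate are not reproduced.

```python
import numpy as np

def rnn_features(seq, Wx, Wh, b):
    """Forward pass of one recurrent layer over a frequency-spectrum
    sequence: h_t = tanh(Wx x_t + Wh h_{t-1} + b). The final hidden
    state serves as the automatically extracted feature vector."""
    h = np.zeros(Wh.shape[0])
    for x in seq:                        # one spectrum frame per time step
        h = np.tanh(Wx @ x + Wh @ h + b)
    return h

rng = np.random.default_rng(0)
n_in, n_hidden, n_steps = 16, 8, 10      # illustrative sizes
Wx = rng.normal(scale=0.1, size=(n_hidden, n_in))
Wh = rng.normal(scale=0.1, size=(n_hidden, n_hidden))
b = np.zeros(n_hidden)
spectrum_seq = rng.normal(size=(n_steps, n_in))  # stand-in for FFT frames
features = rnn_features(spectrum_seq, Wx, Wh, b)
```

    In the full method these features would feed a classifier layer, and the recurrent weights would be trained rather than drawn at random.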

  20. Optimization of the choice of unmanned aerial vehicles used to monitor the implementation of selected construction projects

    NASA Astrophysics Data System (ADS)

    Skorupka, Dariusz; Duchaczek, Artur; Waniewska, Agnieszka; Kowacka, Magdalena

    2017-07-01

    Due to their properties, unmanned aerial vehicles offer a huge number of possible applications in construction engineering. The nature and extent of the construction works performed make the decision to purchase the right equipment significant for its further use in monitoring the implementation of these works. Technical factors, such as the accuracy and quality of the applied measurement instruments, are especially important when monitoring the realization of construction projects. The paper presents the optimization of the choice of unmanned aerial vehicles using the Bellinger method. The decision-making analysis takes into account criteria that are particularly crucial to the scope of monitoring of ongoing construction works.

  1. Program for the development of high temperature electrical materials and components

    NASA Technical Reports Server (NTRS)

    Neff, W. S.; Lowry, L. R.

    1972-01-01

    Evaluation of high temperature, space-vacuum performance of selected electrical materials and components, high temperature capacitor development, and evaluation, construction, and endurance testing of compression sealed pyrolytic boron nitride slot insulation are described. The first subject above covered the aging evaluation of electrical devices constructed from selected electrical materials. Individual materials performances were also evaluated and reported. The second subject included study of methods of improving electrical performance of pyrolytic boron nitride capacitors. The third portion was conducted to evaluate the thermal and electrical performance of pyrolytic boron nitride as stator slot liner material under varied temperature and compressive loading. Conclusions and recommendations are presented.

  2. Construction of human antibody gene libraries and selection of antibodies by phage display.

    PubMed

    Schirrmann, Thomas; Hust, Michael

    2010-01-01

    Recombinant antibodies as therapeutics offer new opportunities for the treatment of many tumor diseases. To date, 18 antibody-based drugs are approved for cancer treatment and hundreds of anti-tumor antibodies are under development. The first clinically approved antibodies were of murine origin or human-mouse chimeric. However, since murine antibody domains are immunogenic in human patients and can result in human anti-mouse antibody (HAMA) responses, mainly humanized and fully human antibodies are currently developed for therapeutic applications. In vitro antibody selection technologies directly allow the selection of human antibodies and the corresponding genes from human antibody gene libraries. Antibody phage display is the most common way to generate human antibodies and has already yielded thousands of recombinant antibodies for research, diagnostics and therapy. Here, we describe methods for the construction of human scFv gene libraries and the selection of antibodies.

  3. Software development to support decision making in the selection of nursing diagnoses and interventions for children and adolescents

    PubMed Central

    Silva, Kenya de Lima; Évora, Yolanda Dora Martinez; Cintra, Camila Santana Justo

    2015-01-01

    Objective: to report the development of a software to support decision-making for the selection of nursing diagnoses and interventions for children and adolescents, based on the nomenclature of nursing diagnoses, outcomes and interventions of a university hospital in Paraiba. Method: a methodological applied study based on software engineering, as proposed by Pressman, developed in three cycles, namely: flow chart construction, development of the navigation interface, and construction of functional expressions and programming development. Result: the software consists of administrative and nursing process screens. The assessment is automatically selected according to age group, the nursing diagnoses are suggested by the system after information is inserted, and can be indicated by the nurse. The interventions for the chosen diagnosis are selected by structuring the care plan. Conclusion: the development of this tool used to document the nursing actions will contribute to decision-making and quality of care. PMID:26487144

  4. Feature selection and classification of multiparametric medical images using bagging and SVM

    NASA Astrophysics Data System (ADS)

    Fan, Yong; Resnick, Susan M.; Davatzikos, Christos

    2008-03-01

    This paper presents a framework for brain classification based on multi-parametric medical images. This method takes advantage of multi-parametric imaging to provide a set of discriminative features for classifier construction by using a regional feature extraction method which takes into account joint correlations among different image parameters; in the experiments herein, MRI and PET images of the brain are used. Support vector machine classifiers are then trained based on the most discriminative features selected from the feature set. To facilitate robust classification and optimal selection of parameters involved in classification, in view of the well-known "curse of dimensionality", base classifiers are constructed in a bagging (bootstrap aggregating) framework for building an ensemble classifier and the classification parameters of these base classifiers are optimized by means of maximizing the area under the ROC (receiver operating characteristic) curve estimated from their prediction performance on left-out samples of bootstrap sampling. This classification system is tested on a sex classification problem, where it yields over 90% classification rates for unseen subjects. The proposed classification method is also compared with other commonly used classification algorithms, with favorable results. These results illustrate that the methods built upon information jointly extracted from multi-parametric images have the potential to perform individual classification with high sensitivity and specificity.
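
    The bagging-plus-AUC idea above can be sketched as follows. As an assumption for a self-contained example, a nearest-centroid scorer stands in for the paper's SVM base classifiers, and the ROC AUC is estimated on the left-out (out-of-bag) samples of each bootstrap draw, mirroring the parameter-selection criterion described in the abstract.

```python
import random

def nearest_centroid_score(train, x):
    """Score for class 1 vs class 0: difference of squared distances to
    the two class centroids (stand-in for an SVM decision value)."""
    def centroid(label):
        pts = [p for p, y in train if y == label]
        d = len(pts[0])
        return [sum(p[i] for p in pts) / len(pts) for i in range(d)]
    c0, c1 = centroid(0), centroid(1)
    d0 = sum((a - b) ** 2 for a, b in zip(x, c0))
    d1 = sum((a - b) ** 2 for a, b in zip(x, c1))
    return d0 - d1  # larger -> closer to class 1

def auc(pairs):
    """Area under the ROC curve from (score, label) pairs (rank statistic)."""
    pos = [s for s, y in pairs if y == 1]
    neg = [s for s, y in pairs if y == 0]
    wins = sum((p > q) + 0.5 * (p == q) for p in pos for q in neg)
    return wins / (len(pos) * len(neg))

def bagged_oob_auc(data, n_boot=25, seed=0):
    """Bootstrap-aggregate base learners; estimate AUC on out-of-bag samples."""
    rng = random.Random(seed)
    oob = []
    for _ in range(n_boot):
        idx = [rng.randrange(len(data)) for _ in range(len(data))]
        boot = [data[i] for i in idx]
        for i in set(range(len(data))) - set(idx):
            x, y = data[i]
            oob.append((nearest_centroid_score(boot, x), y))
    return auc(oob)

# Two well-separated 2-D classes: OOB AUC should be near 1.
rng = random.Random(1)
data = ([([rng.gauss(0, 1), rng.gauss(0, 1)], 0) for _ in range(40)] +
        [([rng.gauss(4, 1), rng.gauss(4, 1)], 1) for _ in range(40)])
score = bagged_oob_auc(data)
print(score > 0.9)  # True
```

    In the paper, maximizing this out-of-bag AUC is what drives the choice of classification parameters for the base classifiers.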

  5. A multi-label learning based kernel automatic recommendation method for support vector machine.

    PubMed

    Zhang, Xueying; Song, Qinbao

    2015-01-01

    Choosing an appropriate kernel is critical when classifying a new problem with a Support Vector Machine. So far, more attention has been paid to constructing new kernels and choosing suitable parameter values for a specific kernel function than to kernel selection itself. Furthermore, most current kernel selection methods focus on seeking the single best kernel with the highest classification accuracy via cross-validation; they are time-consuming and ignore the differences among the numbers of support vectors and the CPU times of SVMs with different kernels. Considering the tradeoff between classification success ratio and CPU time, there may be multiple kernel functions performing equally well on the same classification problem. Aiming to automatically select those appropriate kernel functions for a given data set, we propose a multi-label learning based kernel recommendation method built on the data characteristics. For each data set, a meta-knowledge data base is first created by extracting the feature vector of data characteristics and identifying the corresponding applicable kernel set. Then the kernel recommendation model is constructed on the generated meta-knowledge data base with the multi-label classification method. Finally, appropriate kernel functions are recommended for a new data set by the recommendation model according to the characteristics of the new data set. Extensive experiments over 132 UCI benchmark data sets, with five different types of data set characteristics, eleven typical kernels (Linear, Polynomial, Radial Basis Function, Sigmoidal function, Laplace, Multiquadric, Rational Quadratic, Spherical, Spline, Wave and Circular), and five multi-label classification methods demonstrate that, compared with the existing kernel selection methods and the most widely used RBF kernel function, SVM with the kernel function recommended by our proposed method achieved the highest classification performance.

  6. A Multi-Label Learning Based Kernel Automatic Recommendation Method for Support Vector Machine

    PubMed Central

    Zhang, Xueying; Song, Qinbao

    2015-01-01

    Choosing an appropriate kernel is critical when classifying a new problem with a Support Vector Machine. So far, more attention has been paid to constructing new kernels and choosing suitable parameter values for a specific kernel function than to kernel selection itself. Furthermore, most current kernel selection methods focus on seeking the single best kernel with the highest classification accuracy via cross-validation; they are time-consuming and ignore the differences among the numbers of support vectors and the CPU times of SVMs with different kernels. Considering the tradeoff between classification success ratio and CPU time, there may be multiple kernel functions performing equally well on the same classification problem. Aiming to automatically select those appropriate kernel functions for a given data set, we propose a multi-label learning based kernel recommendation method built on the data characteristics. For each data set, a meta-knowledge data base is first created by extracting the feature vector of data characteristics and identifying the corresponding applicable kernel set. Then the kernel recommendation model is constructed on the generated meta-knowledge data base with the multi-label classification method. Finally, appropriate kernel functions are recommended for a new data set by the recommendation model according to the characteristics of the new data set. Extensive experiments over 132 UCI benchmark data sets, with five different types of data set characteristics, eleven typical kernels (Linear, Polynomial, Radial Basis Function, Sigmoidal function, Laplace, Multiquadric, Rational Quadratic, Spherical, Spline, Wave and Circular), and five multi-label classification methods demonstrate that, compared with the existing kernel selection methods and the most widely used RBF kernel function, SVM with the kernel function recommended by our proposed method achieved the highest classification performance. PMID:25893896
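
    A minimal sketch of the recommendation idea, under loud assumptions: the meta-feature vector below (log sample count, log feature count, mean feature spread, class balance) is illustrative, not the paper's actual characteristic set, and a 1-nearest-neighbor lookup over a hand-made meta-knowledge base stands in for the multi-label classifier.

```python
import math, statistics

def meta_features(X, y):
    """Tiny illustrative meta-feature vector for a labeled data set."""
    n, d = len(X), len(X[0])
    stds = [statistics.pstdev(col) for col in zip(*X)]
    return [math.log(n), math.log(d), sum(stds) / d, sum(y) / n]

def recommend_kernels(X, y, knowledge_base):
    """Recommend the applicable-kernel set of the nearest meta-knowledge
    entry (a 1-NN stand-in for the paper's multi-label classifier)."""
    f = meta_features(X, y)
    def dist(entry):
        return sum((a - b) ** 2 for a, b in zip(f, entry["meta"]))
    return min(knowledge_base, key=dist)["kernels"]

# Hypothetical meta-knowledge base: meta-feature vectors paired with the
# kernel sets observed to perform well on the corresponding data sets.
kb = [
    {"meta": [4.0, 1.0, 1.0, 0.5], "kernels": {"linear", "rbf"}},
    {"meta": [7.0, 3.0, 5.0, 0.1], "kernels": {"rbf", "laplace"}},
]
X = [[0.0, 1.0], [1.0, 0.0], [1.0, 1.0], [0.0, 0.0]] * 14  # 56 samples
y = [0, 1] * 28
rec = recommend_kernels(X, y, kb)
print(rec)
```

    Returning a *set* of kernels, rather than a single winner, is what the multi-label formulation buys over accuracy-only cross-validation.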

  7. Reinforcement Learning Trees

    PubMed Central

    Zhu, Ruoqing; Zeng, Donglin; Kosorok, Michael R.

    2015-01-01

    In this paper, we introduce a new type of tree-based method, reinforcement learning trees (RLT), which exhibits significantly improved performance over traditional methods such as random forests (Breiman, 2001) under high-dimensional settings. The innovations are three-fold. First, the new method implements reinforcement learning at each selection of a splitting variable during the tree construction processes. By splitting on the variable that brings the greatest future improvement in later splits, rather than choosing the one with largest marginal effect from the immediate split, the constructed tree utilizes the available samples in a more efficient way. Moreover, such an approach enables linear combination cuts at little extra computational cost. Second, we propose a variable muting procedure that progressively eliminates noise variables during the construction of each individual tree. The muting procedure also takes advantage of reinforcement learning and prevents noise variables from being considered in the search for splitting rules, so that towards terminal nodes, where the sample size is small, the splitting rules are still constructed from only strong variables. Last, we investigate asymptotic properties of the proposed method under basic assumptions and discuss rationale in general settings. PMID:26903687
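
    The variable-muting idea can be sketched with a toy importance score. As an assumption, the variance reduction of the best single split stands in for RLT's reinforcement-learning-based importance; the point is only that low-scoring variables are muted so later splits consider only strong variables.

```python
import random

def _sse(v):
    m = sum(v) / len(v)
    return sum((x - m) ** 2 for x in v)

def split_importance(X, y, j):
    """Variance reduction of the best single split on feature j
    (a toy importance score, not RLT's reinforcement-learning score)."""
    pairs = sorted(zip((row[j] for row in X), y))
    ys = [p[1] for p in pairs]
    total = _sse(ys)
    return max(total - _sse(ys[:i]) - _sse(ys[i:])
               for i in range(1, len(ys)))

def mute_variables(X, y, keep_fraction=0.5):
    """Keep only the top features by importance, muting the rest -
    the spirit of RLT's progressive variable muting."""
    d = len(X[0])
    scores = sorted(((split_importance(X, y, j), j) for j in range(d)),
                    reverse=True)
    k = max(1, int(d * keep_fraction))
    return sorted(j for _, j in scores[:k])

# One strong variable (feature 0 determines y) among four noise variables.
rng = random.Random(0)
X = [[rng.random() for _ in range(5)] for _ in range(200)]
y = [1.0 if row[0] > 0.5 else 0.0 for row in X]
kept = mute_variables(X, y, keep_fraction=0.2)
print(kept)  # feature 0 survives muting
```

    In RLT proper this elimination happens progressively during tree construction, so that near the terminal nodes, where samples are scarce, only strong variables remain candidates.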

  8. Integration of Occupational Safety to Contractors' or Subcontractors' Performance Evaluation in Construction Projects

    NASA Astrophysics Data System (ADS)

    Kozlovská, Mária; Struková, Zuzana

    2013-06-01

    Several factors should be considered by the owner and general contractor in the process of contractors' and subcontractors' selection and evaluation. The paper reviews recent models intended to guide general contractors in the subcontractor selection process and in the evaluation of different contractors during the execution of the project. Moreover, the paper considers the impact of different contractors' performance on the overall level of occupational health and safety culture at the sites. It deals with the factors influencing the safety performance of contractors during construction and analyses the methods for assessing the safety performance of construction contractors. The results of contractors' safety performance evaluation could be a useful tool in motivating contractors to achieve better safety outcomes and could affect owners' or general contractors' decision making about contractors' suitability for future contracting works.

  9. Pesticide applicators questionnaire content validation: A fuzzy delphi method.

    PubMed

    Manakandan, S K; Rosnah, I; Mohd Ridhuan, J; Priya, R

    2017-08-01

    The most crucial step in constructing a survey questionnaire is deciding on the appropriate items for a construct. Retaining irrelevant items and removing important ones will certainly mislead the direction of a particular study. This article demonstrates the Fuzzy Delphi method as a scientific analysis technique for consolidating consensus agreement within a panel of experts on each item's appropriateness. The method reduces the ambiguity, diversity, and discrepancy of opinions among the experts and hence enhances the quality of the selected items. The main purpose of this study was to obtain experts' consensus on the suitability of the preselected questionnaire items. The panel consisted of sixteen experts from the Occupational and Environmental Health Unit of the Ministry of Health, the Vector-borne Disease Control Unit of the Ministry of Health, and the Occupational Safety and Health Units of both public and private universities. A set of questionnaires related to noise and chemical exposure was compiled based on a literature search. There were six constructs with 60 items in total: three constructs for knowledge, attitude, and practice of noise exposure and three for knowledge, attitude, and practice of chemical exposure. The validation process replicated a recent Fuzzy Delphi method using the concept of triangular fuzzy numbers and a defuzzification process. A 100% response rate was obtained from all sixteen experts, with average Likert scores of four to five. In the post-FDM analysis, the first prerequisite was fulfilled with a threshold value (d) ≤ 0.2; hence all six constructs were accepted. For the second prerequisite, three items (21%) from the noise-attitude construct and four items (40%) from the chemical-practice construct had expert consensus of less than 75%, amounting to about 12% of the total items in the questionnaire. The third prerequisite ranked the items within the constructs by calculating the average fuzzy numbers. The seven items that did not fulfill the second prerequisite were also ranked low in this analysis and were therefore discarded from the final draft. The experts' consensus on the suitability of the preselected items was thus obtained, and the questionnaire set is now ready for further construct validation.
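
    The triangular-fuzzy-number and defuzzification steps can be sketched as follows. The Likert-to-TFN mapping and simple-average defuzzification below are common Fuzzy Delphi conventions and are assumptions here; the formulas used in this study may differ in detail.

```python
import math

# A common Likert-to-triangular-fuzzy-number mapping (one of several
# conventions used with the Fuzzy Delphi method).
TFN = {1: (0.0, 0.0, 0.2), 2: (0.0, 0.2, 0.4), 3: (0.2, 0.4, 0.6),
       4: (0.4, 0.6, 0.8), 5: (0.6, 0.8, 1.0)}

def fdm_item(ratings, d_max=0.2, consensus_min=0.75):
    """Evaluate one questionnaire item from expert Likert ratings.
    Returns (accepted, threshold d, defuzzified score)."""
    fuzzy = [TFN[r] for r in ratings]
    avg = tuple(sum(t[i] for t in fuzzy) / len(fuzzy) for i in range(3))
    # Distance between each expert's TFN and the group-average TFN.
    dists = [math.sqrt(sum((t[i] - avg[i]) ** 2 for i in range(3)) / 3)
             for t in fuzzy]
    d = sum(dists) / len(dists)
    consensus = sum(di <= d_max for di in dists) / len(dists)
    score = sum(avg) / 3  # simple-average defuzzification
    accepted = d <= d_max and consensus >= consensus_min
    return accepted, round(d, 3), round(score, 3)

# Sixteen experts mostly agreeing (ratings of 4 and 5) -> item accepted.
ok, d, score = fdm_item([5, 4, 5, 5, 4, 5, 4, 5, 5, 4, 5, 4, 5, 5, 4, 5])
print(ok, d, score)
```

    The two prerequisites in the abstract map directly onto the threshold test (d ≤ 0.2) and the consensus test (≥ 75% of experts within the threshold); the defuzzified score gives the within-construct ranking.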

  10. Assessment of Useful Plants in the Catchment Area of the Proposed Ntabelanga Dam in the Eastern Cape Province, South Africa

    PubMed Central

    2017-01-01

    Background The developmental projects, particularly construction of dams, result in permanent changes of terrestrial ecosystems through inundation. Objective The present study was undertaken aiming at documenting useful plant species in Ntabelanga dam catchment area that will be impacted by the construction of the proposed dam. Methods A total of 55 randomly selected quadrats were used to assess plant species diversity and composition. Participatory rural appraisal (PRA) methods were used to identify useful plant species growing in the catchment area through interviews with 108 randomly selected participants. Results A total of 197 plant species were recorded with 95 species (48.2%) utilized for various purposes. Use categories included ethnoveterinary and herbal medicines (46 species), food plants (37 species), construction timber and thatching (14 species), firewood (five species), browse, live fence, and ornamental (four species each), and brooms and crafts (two species). Conclusion This study showed that plant species play an important role in the daily life and culture of local people. The construction of Ntabelanga dam is, therefore, associated with several positive and negative impacts on plant resources which are not fully integrated into current decision-making, largely because of lack of multistakeholder dialogue on the socioeconomic issues of such an important project. PMID:28828397

  11. What do Demand-Control and Effort-Reward work stress questionnaires really measure? A discriminant content validity study of relevance and representativeness of measures.

    PubMed

    Bell, Cheryl; Johnston, Derek; Allan, Julia; Pollard, Beth; Johnston, Marie

    2017-05-01

    The Demand-Control (DC) and Effort-Reward Imbalance (ERI) models predict health in a work context. Self-report measures of the four key constructs (demand, control, effort, and reward) have been developed and it is important that these measures have good content validity uncontaminated by content from other constructs. We assessed relevance (whether items reflect the constructs) and representativeness (whether all aspects of the construct are assessed, and all items contribute to that assessment) across the instruments and items. Two studies examined fourteen demand/control items from the Job Content Questionnaire and seventeen effort/reward items from the Effort-Reward Imbalance measure using discriminant content validation and a third study developed new methods to assess instrument representativeness. Both methods use judges' ratings and construct definitions to get transparent quantitative estimates of construct validity. Study 1 used dictionary definitions while studies 2 and 3 used published phrases to define constructs. Overall, 3/5 demand items, 4/9 control items, 1/6 effort items, and 7/11 reward items were uniquely classified to the appropriate theoretical construct and were therefore 'pure' items with discriminant content validity (DCV). All pure items measured a defining phrase. However, both the DC and ERI assessment instruments failed to assess all defining aspects. Finding good discriminant content validity for demand and reward measures means these measures are usable and our quantitative results can guide item selection. By contrast, effort and control measures had limitations (in relevance and representativeness) presenting a challenge to the implementation of the theories. Statement of contribution What is already known on this subject? 
While the reliability and construct validity of Demand-Control and Effort-Reward-Imbalance (DC and ERI) work stress measures are routinely reported, there has not been adequate investigation of their content validity. This paper investigates their content validity in terms of both relevance and representativeness and provides a model for the investigation of content validity of measures in health psychology more generally. What does this study add? A new application of an existing method, discriminant content validity, and a new method of assessing instrument representativeness. 'Pure' DC and ERI items are identified, as are constructs that are not fully represented by their assessment instruments. The findings are important for studies attempting to distinguish between the main DC and ERI work stress constructs. The quantitative results can be used to guide item selection for future studies. © 2017 The British Psychological Society.

  12. Selection of representative embankments based on rough set - fuzzy clustering method

    NASA Astrophysics Data System (ADS)

    Bin, Ou; Lin, Zhi-xiang; Fu, Shu-yan; Gao, Sheng-song

    2018-02-01

    The precondition for a comprehensive evaluation of embankment safety is the selection of representative unit embankments. On the basis of dividing the levee into units, the influencing factors and classification of the unit embankments are drafted. Based on rough set - fuzzy clustering, the influencing factors of the unit embankments are measured by quantitative and qualitative indexes. A fuzzy similarity matrix of the standard embankments is constructed, and the fuzzy equivalence matrix is calculated from the fuzzy similarity matrix by the squaring method. By setting a threshold on the fuzzy equivalence matrix, the unit embankments are clustered, and representative unit embankments are selected from the resulting classification.
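
    The squaring and thresholding steps can be sketched directly: the fuzzy similarity matrix is squared under max-min composition until it stabilizes into a fuzzy equivalence matrix, which is then cut at a threshold λ. The similarity values below are illustrative toy numbers, not embankment data.

```python
def maxmin_square(R):
    """Max-min composition R o R (the 'squaring' step)."""
    n = len(R)
    return [[max(min(R[i][k], R[k][j]) for k in range(n))
             for j in range(n)] for i in range(n)]

def transitive_closure(R):
    """Square until the fuzzy similarity matrix stops changing,
    yielding a fuzzy equivalence matrix."""
    while True:
        R2 = maxmin_square(R)
        if R2 == R:
            return R
        R = R2

def threshold_clusters(R, lam):
    """Cut the equivalence matrix at level lam: units i and j share a
    cluster iff R[i][j] >= lam."""
    n = len(R)
    label, c = [-1] * n, 0
    for i in range(n):
        if label[i] == -1:
            for j in range(n):
                if R[i][j] >= lam:
                    label[j] = c
            c += 1
    return label

# Toy similarity among four unit embankments: units 0-1 similar,
# units 2-3 similar, weak links across the two groups.
R = [[1.0, 0.8, 0.3, 0.2],
     [0.8, 1.0, 0.2, 0.3],
     [0.3, 0.2, 1.0, 0.9],
     [0.2, 0.3, 0.9, 1.0]]
E = transitive_closure(R)
clusters = threshold_clusters(E, lam=0.8)
print(clusters)  # [0, 0, 1, 1]
```

    Because the closure is an equivalence relation, thresholding it yields a proper partition; lowering λ merges clusters, raising it splits them, which is how the granularity of the representative selection is controlled.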

  13. Design of solar thermal dryers for 24-hour food drying processes (abstract)

    USDA-ARS?s Scientific Manuscript database

    Solar drying is a ubiquitous method that has been adopted for many years as a food preservation method. Most of the published articles in the literature provide insight on the performance of solar dryers in service but little information on the dryer construction material selection process or mater...

  14. Modified signal-to-noise: a new simple and practical gene filtering approach based on the concept of projective adaptive resonance theory (PART) filtering method.

    PubMed

    Takahashi, Hiro; Honda, Hiroyuki

    2006-07-01

    Considering the recent advances in and the benefits of DNA microarray technologies, many gene filtering approaches have been employed for the diagnosis and prognosis of diseases. In our previous study, we developed a new filtering method, the projective adaptive resonance theory (PART) filtering method, which was effective in subclass discrimination. In the PART algorithm, the genes with a low variance in gene expression in either class, not necessarily both classes, were selected as important genes for modeling. Based on this concept, we developed novel simple filtering methods, such as modified signal-to-noise (S2N'), in the present study. The discrimination models constructed using these methods showed higher accuracy with higher reproducibility compared with many conventional filtering methods, including the t-test, S2N, NSC and SAM. The reproducibility of prediction was evaluated based on the correlation between the sets of U-test p-values on randomly divided datasets. For leukemia, lymphoma and breast cancer, the correlation was high; an improvement of >0.13 was obtained by the model constructed using <50 genes selected by S2N'. The improvement was greater for smaller gene sets, and no comparably high correlation was observed when the t-test, NSC and SAM were used. These results suggest that these modified methods, such as S2N', have high potential as new methods for marker gene selection in cancer diagnosis using DNA microarray data. Software is available upon request.
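
    A sketch of the filtering idea, with one loud assumption: the paper's exact S2N' formula is not reproduced here, so the modified score below simply divides by twice the *smaller* class standard deviation, rewarding genes tightly expressed in either class (the PART-derived concept); the published definition may differ.

```python
import statistics

def s2n(class_a, class_b):
    """Classic signal-to-noise ratio: |mean difference| / (sd_a + sd_b)."""
    ma, mb = statistics.mean(class_a), statistics.mean(class_b)
    sa, sb = statistics.pstdev(class_a), statistics.pstdev(class_b)
    return abs(ma - mb) / (sa + sb)

def s2n_modified(class_a, class_b):
    """Hypothetical S2N' variant: divide by twice the smaller class SD,
    so a gene with low variance in either class scores high. Assumed
    form, for illustration of the PART concept only."""
    ma, mb = statistics.mean(class_a), statistics.mean(class_b)
    sa, sb = statistics.pstdev(class_a), statistics.pstdev(class_b)
    return abs(ma - mb) / (2.0 * min(sa, sb))

# A gene tight in class A but noisy in class B: the modified score
# rewards the tight class, while the classic score is dragged down
# by the noisy one.
a = [1.0, 1.1, 0.9, 1.0, 1.05]
b = [3.0, 5.0, 1.5, 4.0, 2.5]
print(s2n(a, b) < s2n_modified(a, b))  # True
```

    Ranking all genes by such a score and keeping the top k is the whole filtering step; the abstract's comparisons use the resulting gene lists to build discrimination models.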

  15. Gradient SiNO anti-reflective layers in solar selective coatings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ren, Zhifeng; Cao, Feng; Sun, Tianyi

    A solar selective coating includes a substrate, a cermet layer having nanoparticles therein deposited on the substrate, and an anti-reflection layer deposited on the cermet layer. The cermet layer and the anti-reflection layer may each be formed of intermediate layers. A method for constructing a solar-selective coating is disclosed and includes preparing a substrate, depositing a cermet layer on the substrate, and depositing an anti-reflection layer on the cermet layer.

  16. Comparison of methods for library construction and short read annotation of shellfish viral metagenomes.

    PubMed

    Wei, Hong-Ying; Huang, Sheng; Wang, Jiang-Yong; Gao, Fang; Jiang, Jing-Zhe

    2018-03-01

    The emergence and widespread use of high-throughput sequencing technologies have promoted metagenomic studies on environmental and animal samples. Library construction for metagenome sequencing and annotation of the produced sequence reads are important steps in such studies and influence the quality of metagenomic data. In this study, we collected marine mollusk samples, such as Crassostrea hongkongensis, Chlamys farreri, and Ruditapes philippinarum, from coastal areas in South China. These samples were divided into two batches to compare two library construction methods for shellfish viral metagenomes. Our analysis showed that reverse-transcribing RNA into cDNA and then amplifying it simultaneously with DNA by whole genome amplification (WGA) yielded a larger amount of DNA than using only WGA or WTA (whole transcriptome amplification). Moreover, higher quality libraries were obtained by agarose gel extraction rather than by AMPure bead size selection. However, the latter can also provide good results if combined with adjustment of the filter parameters; this, together with its simplicity, makes it a viable alternative. Finally, we compared three annotation tools (BLAST, DIAMOND, and Taxonomer) and two reference databases (NCBI's NR and UniProt's UniRef). Considering the limitations of computing resources and data transfer speed, we propose the use of DIAMOND with UniRef for annotating metagenomic short reads, as its running speed can guarantee a good annotation rate. This study may serve as a useful reference for selecting methods for shellfish viral metagenome library construction and read annotation.

  17. Identification of solid state fermentation degree with FT-NIR spectroscopy: Comparison of wavelength variable selection methods of CARS and SCARS.

    PubMed

    Jiang, Hui; Zhang, Hang; Chen, Quansheng; Mei, Congli; Liu, Guohai

    2015-01-01

    The use of wavelength variable selection before partial least squares discriminant analysis (PLS-DA) for qualitative identification of solid state fermentation degree by the FT-NIR spectroscopy technique was investigated in this study. Two wavelength variable selection methods, competitive adaptive reweighted sampling (CARS) and stability competitive adaptive reweighted sampling (SCARS), were employed to select the important wavelengths. PLS-DA was applied to calibrate identification models using the wavelength variables selected by CARS and SCARS. Experimental results showed that the numbers of wavelength variables selected by CARS and SCARS were 58 and 47, respectively, out of the 1557 original wavelength variables. Compared with the results of full-spectrum PLS-DA, both wavelength variable selection methods enhanced the performance of the identification models. Meanwhile, compared with the CARS-PLS-DA model, the SCARS-PLS-DA model achieved better results, with an identification rate of 91.43% in the validation process. The overall results demonstrate that a PLS-DA model constructed using wavelength variables selected by a proper variable selection method can identify solid state fermentation degree more accurately. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. Identification of solid state fermentation degree with FT-NIR spectroscopy: Comparison of wavelength variable selection methods of CARS and SCARS

    NASA Astrophysics Data System (ADS)

    Jiang, Hui; Zhang, Hang; Chen, Quansheng; Mei, Congli; Liu, Guohai

    2015-10-01

    The use of wavelength variable selection before partial least squares discriminant analysis (PLS-DA) for qualitative identification of solid state fermentation degree by the FT-NIR spectroscopy technique was investigated in this study. Two wavelength variable selection methods, competitive adaptive reweighted sampling (CARS) and stability competitive adaptive reweighted sampling (SCARS), were employed to select the important wavelengths. PLS-DA was applied to calibrate identification models using the wavelength variables selected by CARS and SCARS. Experimental results showed that the numbers of wavelength variables selected by CARS and SCARS were 58 and 47, respectively, out of the 1557 original wavelength variables. Compared with the results of full-spectrum PLS-DA, both wavelength variable selection methods enhanced the performance of the identification models. Meanwhile, compared with the CARS-PLS-DA model, the SCARS-PLS-DA model achieved better results, with an identification rate of 91.43% in the validation process. The overall results demonstrate that a PLS-DA model constructed using wavelength variables selected by a proper variable selection method can identify solid state fermentation degree more accurately.
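
    The core CARS mechanism, an exponentially shrinking fraction of retained wavelengths ranked by a model coefficient, can be sketched as follows. As an assumption for a self-contained example, absolute correlation with the class label stands in for the PLS-DA regression coefficients, and the Monte Carlo sampling and stability weighting of CARS/SCARS are omitted.

```python
import math, random

def corr(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / math.sqrt(vx * vy) if vx and vy else 0.0

def cars_like_selection(X, y, n_runs=20, keep_final=0.1):
    """CARS-style sketch: rank variables (here by |corr| with y, standing
    in for PLS regression coefficients) and retain an exponentially
    shrinking fraction of them on each run."""
    d = len(X[0])
    kept = list(range(d))
    decay = keep_final ** (1.0 / n_runs)  # 100% down to keep_final
    ratio = 1.0
    for _ in range(n_runs):
        ratio *= decay
        cols = [[row[j] for row in X] for j in kept]
        scores = [abs(corr(c, y)) for c in cols]
        order = sorted(range(len(kept)), key=lambda i: -scores[i])
        n_keep = max(1, int(round(d * ratio)))
        kept = sorted(kept[i] for i in order[:n_keep])
    return kept

# 30 noise wavelengths plus two informative ones (indices 3 and 17).
rng = random.Random(42)
X, y = [], []
for _ in range(60):
    label = rng.random() < 0.5
    row = [rng.gauss(0, 1) for _ in range(32)]
    row[3] += 3.0 * label
    row[17] -= 3.0 * label
    X.append(row)
    y.append(1.0 if label else 0.0)
selected = cars_like_selection(X, y, n_runs=15, keep_final=2 / 32)
print(selected)  # the informative wavelengths survive
```

    The exponential schedule is what lets CARS prune aggressively at first and cautiously near the end; SCARS additionally weights each variable by the stability of its coefficient across sampling runs.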

  19. Safety Early Warning Research for Highway Construction Based on Case-Based Reasoning and Variable Fuzzy Sets

    PubMed Central

    Liu, Yan; Xu, Zhen-Jun

    2013-01-01

    As a high-risk subindustry involved in construction projects, highway construction safety has experienced major developments in the past 20 years, mainly due to the lack of safe early warnings in Chinese construction projects. By combining the current state of early warning technology with the requirements of the State Administration of Work Safety and using case-based reasoning (CBR), this paper expounds on the concept and flow of highway construction safety early warnings based on CBR. The present study provides solutions to three key issues, index selection, accident cause association analysis, and warning degree forecasting implementation, through the use of association rule mining, support vector machine classifiers, and variable fuzzy qualitative and quantitative change criterion modes, which fully cover the needs of safe early warning systems. Using a detailed description of the principles and advantages of each method and by proving the methods' effectiveness and ability to act together in safe early warning applications, effective means and intelligent technology for a safe highway construction early warning system are established. PMID:24191134

  20. Safety early warning research for highway construction based on case-based reasoning and variable fuzzy sets.

    PubMed

    Liu, Yan; Yi, Ting-Hua; Xu, Zhen-Jun

    2013-01-01

    As a high-risk subindustry involved in construction projects, highway construction safety has experienced major developments in the past 20 years, mainly due to the lack of safe early warnings in Chinese construction projects. By combining the current state of early warning technology with the requirements of the State Administration of Work Safety and using case-based reasoning (CBR), this paper expounds on the concept and flow of highway construction safety early warnings based on CBR. The present study provides solutions to three key issues, index selection, accident cause association analysis, and warning degree forecasting implementation, through the use of association rule mining, support vector machine classifiers, and variable fuzzy qualitative and quantitative change criterion modes, which fully cover the needs of safe early warning systems. Using a detailed description of the principles and advantages of each method and by proving the methods' effectiveness and ability to act together in safe early warning applications, effective means and intelligent technology for a safe highway construction early warning system are established.
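
    The case-based reasoning retrieval step can be sketched minimally. The safety indexes and warning degrees below are hypothetical illustrations, and a majority vote over the retrieved cases stands in for the paper's SVM-based warning degree forecasting.

```python
def retrieve(cases, query, k=3):
    """CBR retrieval: return the k most similar past cases by
    inverse-distance similarity over shared, normalized safety indexes."""
    def similarity(case):
        d = sum((case["indexes"][key] - query[key]) ** 2 for key in query)
        return 1.0 / (1.0 + d)
    return sorted(cases, key=similarity, reverse=True)[:k]

def warn_degree(cases, query, k=3):
    """Forecast a warning degree as the majority vote of the retrieved
    cases (a simple stand-in for the paper's SVM classifier)."""
    degrees = [c["degree"] for c in retrieve(cases, query, k)]
    return max(set(degrees), key=degrees.count)

# Hypothetical case base: indexes are normalized scores for, e.g.,
# slope stability risk, weather risk, and crew inexperience.
case_base = [
    {"indexes": {"slope": 0.9, "weather": 0.8, "crew": 0.3}, "degree": "red"},
    {"indexes": {"slope": 0.8, "weather": 0.9, "crew": 0.2}, "degree": "red"},
    {"indexes": {"slope": 0.2, "weather": 0.3, "crew": 0.9}, "degree": "green"},
    {"indexes": {"slope": 0.3, "weather": 0.2, "crew": 0.8}, "degree": "green"},
    {"indexes": {"slope": 0.6, "weather": 0.5, "crew": 0.5}, "degree": "yellow"},
]
degree = warn_degree(case_base, {"slope": 0.85, "weather": 0.8, "crew": 0.25})
print(degree)
```

    In the full system, the indexes themselves come from association rule mining over past accident causes, and the qualitative/quantitative change criteria of variable fuzzy sets refine the degree boundaries.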

  1. Opportunistic Constructive Induction: Using Fragments of Domain Knowledge to Guide Construction

    DTIC Science & Technology

    1991-01-01

    contribute to its success in ways which they are often unaware of and, unfortunately, which go by unacknowledged. At the risk of omitting some people I am...ference on Machine Learning, pages 322-329, Austin, TX, June 1990. [Blumer et al., 1987] Anselm Blumer, Andrzej Ehrenfeucht, David Haussler, and...and Ryszard S. Michalski. A Comparative Review of Selected Methods for Learning from Examples. In Machine Learning: An Artificial Intelligence

  2. Composite pipe to metal joint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leslie, James C.; Leslie, II, James C.; Heard, James

    A method for making a metal to composite tube joint including selecting an elongated interior fitting constructed with an exterior barrel, reduced in exterior diameter to form a distally facing annular shoulder and then projecting still further distally to form an interior sleeve having a radially outwardly facing bonding surface. Selecting an elongated metal outer sleeve formed proximally with a collar constructed for receipt over the barrel and increased in interior diameter and projecting distally to form an exterior sleeve having a radially inwardly facing bonding surface cooperating with the first bonding surface to form an annulus receiving an extremity of a composite tube and a bond bonding the extremity of the tube to the bonding surfaces.

  3. Tailoring Selective Laser Melting Process Parameters for NiTi Implants

    NASA Astrophysics Data System (ADS)

    Bormann, Therese; Schumacher, Ralf; Müller, Bert; Mertmann, Matthias; de Wild, Michael

    2012-12-01

    Complex-shaped NiTi constructions are becoming increasingly essential for biomedical applications, especially for dental or cranio-maxillofacial implants. The additive manufacturing method of selective laser melting (SLM) allows complex-shaped elements with predefined porosity and three-dimensional micro-architecture to be realized directly from the design data. We demonstrate that intentional modification of the applied energy during the SLM process allows the transformation temperatures of NiTi entities to be tailored within the entire construction. Differential scanning calorimetry, x-ray diffraction, and metallographic analysis were employed for the thermal and structural characterizations. In particular, the phase transformation temperatures, the related crystallographic phases, and the formed microstructures of SLM constructions were determined for a series of SLM processing parameters. The SLM-NiTi exhibits pseudoelastic behavior. In this manner, the properties of NiTi implants can be tailored to build smart implants with pre-defined micro-architecture and advanced performance.

  4. The Effects of Computer-Supported Inquiry-Based Learning Methods and Peer Interaction on Learning Stellar Parallax

    ERIC Educational Resources Information Center

    Ruzhitskaya, Lanika

    2011-01-01

    The presented research study investigated the effects of computer-supported inquiry-based learning and peer interaction methods on effectiveness of learning a scientific concept. The stellar parallax concept was selected as a basic, and yet important in astronomy, scientific construct, which is based on a straightforward relationship of several…

  5. Enhancing prediction power of chemometric models through manipulation of the fed spectrophotometric data: A comparative study

    NASA Astrophysics Data System (ADS)

    Saad, Ahmed S.; Hamdy, Abdallah M.; Salama, Fathy M.; Abdelkawy, Mohamed

    2016-10-01

    The effect of data manipulation in the preprocessing step preceding construction of chemometric models was assessed. The same set of UV spectral data was used to construct PLS and PCR models, both directly and after mathematical manipulation as per the well-known first and second derivatives of the absorption spectra, ratio spectra, and first and second derivatives of the ratio spectra spectrophotometric methods; the optimal working wavelength ranges were carefully selected for each model before the models were constructed. Unexpectedly, the number of latent variables used for model construction varied among the different methods. The prediction power of the different models was compared using a validation set of 8 mixtures prepared as per the multilevel multifactor design, and the results were statistically compared using a two-way ANOVA test. Root mean squared error of prediction (RMSEP) was used for further comparison of the predictability of the different constructed models. Although no significant difference was found between results obtained using the Partial Least Squares (PLS) and Principal Component Regression (PCR) models, discrepancies among results were attributed to variation in the discrimination power of the adopted spectrophotometric methods on the spectral data.

  6. Monitoring the Error Rate of Modern Methods of Construction Based on Wood

    NASA Astrophysics Data System (ADS)

    Švajlenka, Jozef; Kozlovská, Mária

    2017-06-01

    A range of new and innovative construction systems currently under development represent modern methods of construction (MMC), which aim to improve the performance parameters of buildings throughout their life cycle. Regarding the implementation of modern methods of construction in Slovakia, assembled buildings based on wood appear to be the most preferred construction system. The study presented in the paper surveyed already built and occupied wood-based family houses, examining the residents' attitudes to this type of building in the context of the declared design and qualitative parameters of efficiency and sustainability. The methodology of the research study is based on a socio-economic survey carried out during the years 2015-2017 within the Slovak Republic. Due to the large extent of the data collected through the questionnaire, only selected parts of the survey results are evaluated and discussed in the paper. The paper is aimed at evaluating the quality of the buildings as expressed by users of existing wooden buildings. The research indicates that some defects occur, which can be eliminated by improving the quality of the production process in future development.

  7. Niche construction, sources of selection and trait coevolution.

    PubMed

    Laland, Kevin; Odling-Smee, John; Endler, John

    2017-10-06

    Organisms modify and choose components of their local environments. This 'niche construction' can alter ecological processes, modify natural selection and contribute to inheritance through ecological legacies. Here, we propose that niche construction initiates and modifies the selection acting on the constructor, and on other species, in an orderly, directed and sustained manner. By dependably generating specific environmental states, niche construction co-directs adaptive evolution by imposing a consistent statistical bias on selection. We illustrate how niche construction can generate this evolutionary bias by comparing it with artificial selection. We suggest that it occupies the middle ground between artificial and natural selection. We show how the perspective leads to testable predictions related to: (i) reduced variance in measures of responses to natural selection in the wild; (ii) multiple trait coevolution, including the evolution of sequences of traits and patterns of parallel evolution; and (iii) a positive association between niche construction and biodiversity. More generally, we submit that evolutionary biology would benefit from greater attention to the diverse properties of all sources of selection.

  8. Construct-level predictive validity of educational attainment and intellectual aptitude tests in medical student selection: meta-regression of six UK longitudinal studies

    PubMed Central

    2013-01-01

    Background Measures used for medical student selection should predict future performance during training. A problem for any selection study is that predictor-outcome correlations are known only in those who have been selected, whereas selectors need to know how measures would predict in the entire pool of applicants. That problem of interpretation can be solved by calculating construct-level predictive validity, an estimate of true predictor-outcome correlation across the range of applicant abilities. Methods Construct-level predictive validities were calculated in six cohort studies of medical student selection and training (student entry, 1972 to 2009) for a range of predictors, including A-levels, General Certificates of Secondary Education (GCSEs)/O-levels, and aptitude tests (AH5 and UK Clinical Aptitude Test (UKCAT)). Outcomes included undergraduate basic medical science and finals assessments, as well as postgraduate measures of Membership of the Royal Colleges of Physicians of the United Kingdom (MRCP(UK)) performance and entry in the Specialist Register. Construct-level predictive validity was calculated with the method of Hunter, Schmidt and Le (2006), adapted to correct for right-censorship of examination results due to grade inflation. Results Meta-regression analyzed 57 separate predictor-outcome correlations (POCs) and construct-level predictive validities (CLPVs). Mean CLPVs are substantially higher (.450) than mean POCs (.171). Mean CLPVs for first-year examinations were high for A-levels (.809; CI: .501 to .935), and lower for GCSEs/O-levels (.332; CI: .024 to .583) and UKCAT (mean = .245; CI: .207 to .276). A-levels had higher CLPVs for all undergraduate and postgraduate assessments than did GCSEs/O-levels and intellectual aptitude tests. CLPVs of educational attainment measures decline somewhat during training, but continue to predict postgraduate performance. Intellectual aptitude tests have lower CLPVs than A-levels or GCSEs/O-levels. Conclusions Educational attainment has strong CLPVs for undergraduate and postgraduate performance, accounting for perhaps 65% of true variance in first year performance. Such CLPVs justify the use of educational attainment measures in selection, but also raise a key theoretical question concerning the remaining 35% of variance (even after measurement error, range restriction and right-censorship have been taken into account). Just as in astrophysics, ‘dark matter’ and ‘dark energy’ are posited to balance various theoretical equations, so medical student selection must also have its ‘dark variance’, whose nature is not yet properly characterized, but which explains a third of the variation in performance during training. Some variance probably relates to factors which are unpredictable at selection, such as illness or other life events, but some is probably also associated with factors such as personality, motivation or study skills. PMID:24229353
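The gap between mean POCs (.171) and mean CLPVs (.450) comes from correcting correlations observed in the selected group back to the full applicant pool. The full Hunter-Schmidt-Le method also corrects for measurement error and (here) right-censorship; the classical direct range-restriction step alone can be sketched like this, with illustrative values rather than the paper's data:

```python
import math

def correct_range_restriction(r_obs: float, u: float) -> float:
    """Thorndike Case II correction for direct range restriction.

    r_obs : predictor-outcome correlation observed in the selected group
    u     : ratio of predictor SD in applicants to SD in the selected
            group (u >= 1; u = 1 means no restriction)
    """
    return (r_obs * u) / math.sqrt(1 + r_obs**2 * (u**2 - 1))

# Example: a modest observed correlation rises sharply once range
# restriction is undone (u = 2 means applicants were twice as variable).
print(correct_range_restriction(0.17, 2.0))  # ~ 0.33
```

This is only one of the corrections the study applies, but it shows why construct-level validities exceed raw predictor-outcome correlations.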

  9. [Production of marker-free plants expressing the gene of the hepatitis B virus surface antigen].

    PubMed

    Rukavtsova, E B; Gaiazova, A R; Chebotareva, E N; Bur'ianova, Ia I

    2009-08-01

    The pBM plasmid, carrying the gene of hepatitis B virus surface antigen (HBsAg) and free of any selection markers of antibiotic or herbicide resistance, was constructed for genetic transformation of plants. A method for screening transformed plant seedlings on nonselective media was developed. Enzyme immunoassay was used for selecting transgenic plants with HBsAg gene among the produced regenerants; this method provides for a high sensitivity detection of HBsAg in plant extracts. Tobacco and tomato transgenic lines synthesizing this antigen at a level of 0.01-0.05% of the total soluble protein were obtained. The achieved level of HBsAg synthesis is sufficient for preclinical trials of the produced plants as a new generation safe edible vaccine. The developed method for selecting transformants can be used for producing safe plants free of selection markers.

  10. Analytical network process based optimum cluster head selection in wireless sensor network.

    PubMed

    Farman, Haleem; Javed, Huma; Jan, Bilal; Ahmad, Jamil; Ali, Shaukat; Khalil, Falak Naz; Khan, Murad

    2017-01-01

    Wireless Sensor Networks (WSNs) are becoming ubiquitous in everyday life due to their applications in weather forecasting, surveillance, implantable sensors for health monitoring and a plethora of other applications. A WSN is equipped with hundreds to thousands of small sensor nodes. As the size of a sensor node decreases, critical issues such as limited energy, computation time and limited memory become even more pronounced. In such a case, network lifetime mainly depends on efficient use of available resources. Organizing nearby nodes into clusters makes it convenient to efficiently manage each cluster as well as the overall network. In this paper, we extend our previous work on a grid-based hybrid network deployment approach, in which a merge and split technique was proposed to construct the network topology. Having constructed the topology with this technique, we use an analytical network process (ANP) model for cluster head selection in the WSN. Five distinct parameters are considered for CH selection: distance from nodes (DistNode), residual energy level (REL), distance from centroid (DistCent), number of times the node has been selected as cluster head (TCH) and merged node (MN). The problem of CH selection based on these parameters is tackled as a multi-criteria decision problem, for which the ANP method is used for optimum cluster head selection. The main contribution of this work is to check the applicability of the ANP model for cluster head selection in WSNs. In addition, sensitivity analysis is carried out to check the stability of the alternatives (available candidate nodes) and their ranking for different scenarios. The simulation results show that the proposed method outperforms existing energy-efficient clustering protocols in terms of optimum CH selection and in minimizing the CH reselection process, which results in extending overall network lifetime. The paper also shows that the ANP method gives a better understanding of the dependencies among the different components involved in the evaluation process.
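ANP itself builds pairwise-comparison matrices and a supermatrix to capture dependencies among criteria; as a much simpler stand-in, the shape of multi-criteria CH scoring over the paper's five parameters can be sketched like this (all node values and weights are invented):

```python
import numpy as np

# Candidate nodes scored on the paper's five criteria. Values are made up;
# a real WSN would measure them per node.
# Columns: DistNode, REL, DistCent, TCH, MN  (REL and MN benefit, others cost)
candidates = {
    "n1": [12.0, 0.80, 5.0, 2, 1],
    "n2": [ 9.0, 0.60, 7.0, 0, 0],
    "n3": [15.0, 0.95, 4.0, 1, 1],
}
weights = np.array([0.25, 0.35, 0.20, 0.10, 0.10])   # assumed priorities
benefit = np.array([False, True, False, False, True])

M = np.array(list(candidates.values()), dtype=float)
# Min-max normalise each criterion, flipping cost criteria so higher = better.
spread = M.max(axis=0) - M.min(axis=0)
spread[spread == 0] = 1.0
norm = (M - M.min(axis=0)) / spread
norm[:, ~benefit] = 1.0 - norm[:, ~benefit]

scores = norm @ weights
best = max(zip(candidates, scores), key=lambda kv: kv[1])[0]
print(dict(zip(candidates, scores.round(3))), "-> CH:", best)
```

A weighted sum cannot model the interdependence between criteria that motivates ANP; it only illustrates the decision structure the record describes.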

  11. Analytical network process based optimum cluster head selection in wireless sensor network

    PubMed Central

    Javed, Huma; Jan, Bilal; Ahmad, Jamil; Ali, Shaukat; Khalil, Falak Naz; Khan, Murad

    2017-01-01

    Wireless Sensor Networks (WSNs) are becoming ubiquitous in everyday life due to their applications in weather forecasting, surveillance, implantable sensors for health monitoring and a plethora of other applications. A WSN is equipped with hundreds to thousands of small sensor nodes. As the size of a sensor node decreases, critical issues such as limited energy, computation time and limited memory become even more pronounced. In such a case, network lifetime mainly depends on efficient use of available resources. Organizing nearby nodes into clusters makes it convenient to efficiently manage each cluster as well as the overall network. In this paper, we extend our previous work on a grid-based hybrid network deployment approach, in which a merge and split technique was proposed to construct the network topology. Having constructed the topology with this technique, we use an analytical network process (ANP) model for cluster head selection in the WSN. Five distinct parameters are considered for CH selection: distance from nodes (DistNode), residual energy level (REL), distance from centroid (DistCent), number of times the node has been selected as cluster head (TCH) and merged node (MN). The problem of CH selection based on these parameters is tackled as a multi-criteria decision problem, for which the ANP method is used for optimum cluster head selection. The main contribution of this work is to check the applicability of the ANP model for cluster head selection in WSNs. In addition, sensitivity analysis is carried out to check the stability of the alternatives (available candidate nodes) and their ranking for different scenarios. The simulation results show that the proposed method outperforms existing energy-efficient clustering protocols in terms of optimum CH selection and in minimizing the CH reselection process, which results in extending overall network lifetime. The paper also shows that the ANP method gives a better understanding of the dependencies among the different components involved in the evaluation process. PMID:28719616

  12. The (Un)Certainty of Selectivity in Liquid Chromatography Tandem Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Berendsen, Bjorn J. A.; Stolker, Linda A. M.; Nielen, Michel W. F.

    2013-01-01

    We developed a procedure to determine the "identification power" of an LC-MS/MS method operated in the MRM acquisition mode, which is related to its selectivity. The probability of any compound showing the same precursor ion, product ions, and retention time as the compound of interest is used as a measure of selectivity. This is calculated based upon empirical models constructed from three very large compound databases. Based upon the final probability estimation, additional measures to assure unambiguous identification can be taken, like the selection of different or additional product ions. The reported procedure in combination with criteria for relative ion abundances results in a powerful technique to determine the (un)certainty of the selectivity of any LC-MS/MS analysis and thus the risk of false positive results. Furthermore, the procedure is very useful as a tool to validate method selectivity.
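As a back-of-the-envelope version of the idea, if the matching probabilities were independent (the paper instead estimates them from empirical models over large compound databases), the chance of a false co-identification multiplies out as follows; all numbers below are assumptions, not values from the paper:

```python
# All probabilities are illustrative assumptions, not values from the paper.
p_precursor = 0.01   # random compound shares the precursor m/z within tolerance
p_product   = 0.05   # it also yields a given product ion
p_rt        = 0.10   # it also co-elutes within the retention-time window

n_products = 2       # MRM methods commonly monitor two product ions
p_false = p_precursor * p_product ** n_products * p_rt
print(f"P(chance co-identification) = {p_false:.1e}")
# Monitoring one extra product ion would tighten this by another factor of 20.
```

This is why the procedure recommends selecting different or additional product ions when the estimated probability is too high.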

  13. Palladium-Catalyzed Dehydrogenative Coupling: An Efficient Synthetic Strategy for the Construction of the Quinoline Core

    PubMed Central

    Carral-Menoyo, Asier; Ortiz-de-Elguea, Verónica; Martinez-Nunes, Mikel; Sotomayor, Nuria; Lete, Esther

    2017-01-01

    Palladium-catalyzed dehydrogenative coupling is an efficient synthetic strategy for the construction of quinoline scaffolds, a privileged structure and prevalent motif in many natural and biologically active products, in particular in marine alkaloids. Thus, quinolines and 1,2-dihydroquinolines can be selectively obtained in moderate-to-good yields via intramolecular C–H alkenylation reactions by choosing the reaction conditions. This methodology provides a direct method for the construction of this type of quinoline through an efficient and atom-economical procedure, and constitutes a significant advance over existing procedures that require preactivated reaction partners. PMID:28867803

  14. Using a fuzzy DEMATEL method for analyzing the factors influencing subcontractors selection

    NASA Astrophysics Data System (ADS)

    Kozik, Renata

    2016-06-01

    Subcontracting is a long-standing practice in the construction industry. This form of project organization, if managed properly, can provide better quality and reductions in project time and costs. Subcontractor selection is a multi-criteria problem and can be determined by many factors. Identifying the importance of each of them, as well as the direction of the cause-effect relations between the various types of factors, can improve the management process. Their values can be evaluated on the basis of available expert opinions with the application of a fuzzy multi-stage grading scale. In this paper it is recommended to use the fuzzy DEMATEL method to analyze the relationships between factors affecting subcontractor selection.
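The fuzzy DEMATEL variant defuzzifies triangular fuzzy expert ratings first; once a crisp direct-relation matrix is in hand, the core DEMATEL computation is the total-relation matrix T = N(I - N)^-1, from which prominence (R + C) and cause-effect relation (R - C) are read off. A sketch with invented factors and ratings:

```python
import numpy as np

# Direct-relation matrix D: expert ratings (0-4) of how strongly factor i
# influences factor j. Factors and scores here are illustrative only.
factors = ["price", "experience", "quality", "schedule"]
D = np.array([
    [0, 3, 2, 1],
    [2, 0, 3, 2],
    [1, 2, 0, 3],
    [1, 1, 2, 0],
], dtype=float)

# Normalise by the largest row sum, then total-relation matrix T = N (I - N)^-1.
N = D / D.sum(axis=1).max()
T = N @ np.linalg.inv(np.eye(len(D)) - N)

R, C = T.sum(axis=1), T.sum(axis=0)   # influence given / received
for f, prom, rel in zip(factors, R + C, R - C):
    print(f"{f:10s} prominence={prom:.2f} relation={rel:+.2f}")
```

Factors with positive R - C are net causes, negative ones net effects, which is exactly the cause-effect direction the paper sets out to identify.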

  15. Nurses' adherence to the Kangaroo Care Method: support for nursing care management

    PubMed Central

    da Silva, Laura Johanson; Leite, Josete Luzia; Scochi, Carmen Gracinda Silvan; da Silva, Leila Rangel; da Silva, Thiago Privado

    2015-01-01

    OBJECTIVE: construct an explanatory theoretical model about nurses' adherence to the Kangaroo Care Method at the Neonatal Intensive Care Unit, based on the meanings and interactions for care management. METHOD: qualitative research, based on the reference framework of the Grounded Theory. Eight nurses were interviewed at a Neonatal Intensive Care Unit in the city of Rio de Janeiro. The comparative analysis of the data comprised the phases of open, axial and selective coding. A theoretical conditional-causal model was constructed. RESULTS: four main categories emerged that composed the analytic paradigm: Giving one's best to the Kangaroo Method; Working with the complexity of the Kangaroo Method; Finding (de)motivation to apply the Kangaroo Method; and Facing the challenges for the adherence to and application of the Kangaroo Method. CONCLUSIONS: the central phenomenon revealed that each nurse and team professional has a role of multiplying values and practices that may or may not be constructive, potentially influencing the (dis)continuity of the Kangaroo Method at the Neonatal Intensive Care Unit. The findings can be used to outline management strategies that go beyond the courses and training and guarantee the strengthening of the care model. PMID:26155013

  16. Biology Notes.

    ERIC Educational Resources Information Center

    School Science Review, 1984

    1984-01-01

    Describes mushroom growing as a school project, a method for illustrating need for carbon dioxide in photosynthesis, construction of a simple phytoplankton sampler, cell division activity using playing cards, blood separation activity, measurement of adaptation and selection pressures, computations in field ecology, and an activity demonstrating…

  17. Risk analysis within environmental impact assessment of proposed construction activity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zeleňáková, Martina; Zvijáková, Lenka

    Environmental impact assessment is an important process, prior to approval of the investment plan, providing a detailed examination of the likely and foreseeable impacts of proposed construction activity on the environment. The objective of this paper is to develop a specific methodology for the analysis and evaluation of environmental impacts of selected constructions – flood protection structures – using risk analysis methods. The application of methodology designed for the process of environmental impact assessment will develop assumptions for further improvements or more effective implementation and performance of this process. The main objective of the paper is to improve the implementation of the environmental impact assessment process. Through the use of risk analysis methods in the environmental impact assessment process, the set objective has been achieved. - Highlights: This paper is informed by an effort to develop research with the aim of: • Improving existing qualitative and quantitative methods for assessing the impacts • A better understanding of relations between probabilities and consequences • Methodology for the EIA of flood protection constructions based on risk analysis • Creative approaches in the search for environmentally friendly proposed activities.

  18. A Diversity-Oriented Library of Fluorophore-Modified Receptors Constructed from a Chemical Library of Synthetic Fluorophores.

    PubMed

    Nakano, Shun; Tamura, Tomoki; Das, Raj Kumar; Nakata, Eiji; Chang, Young-Tae; Morii, Takashi

    2017-11-16

    The practical application of biosensors can be determined by evaluating the sensing ability of fluorophore-modified derivatives of a receptor with appropriate recognition characteristics for target molecules. One of the key determinants for successfully obtaining a useful biosensor is wide variation in the fluorophores attached to a given receptor. Thus, using a larger fluorophore-modified receptor library provides a higher probability of obtaining a practically useful biosensor. However, no effective method has yet been developed for constructing such a diverse library of fluorophore-modified receptors. Herein, we report a method for constructing fluorophore-modified receptors by using a chemical library of synthetic fluorophores with a thiol-reactive group. This library was converted into a library of fluorophore-modified adenosine-binding ribonucleopeptide (RNP) receptors by introducing the fluorophores to the Rev peptide of the RNP complex by alkylation of the thiol group. This method enabled the construction of 263 fluorophore-modified ATP-binding RNP receptors and allowed the selection of suitable receptor-based fluorescent sensors that target ATP. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. The Value of Measurement for Development of Nursing Knowledge:Underlying Philosophy, Contributions and Critiques.

    PubMed

    Durepos, Pamela; Orr, Elizabeth; Ploeg, Jenny; Kaasalainen, Sharon

    2018-06-26

    A philosophical discussion of constructive realism and measurement in the development of nursing knowledge is presented. Through Carper's four patterns of knowing, nurses come to know a person holistically. However, measurement as a source of nursing knowledge has been criticized for its underlying positivism and reductionist approach to exploring reality, which seems misaligned with person-centered care. Discussion paper. Constructive realism bridges positivism and constructivism, facilitating the measurement of physical and psychological phenomena. Reduction of complex phenomena and theoretical constructs into measurable properties is essential to building nursing's empiric knowledge and facilitates (rather than inhibits) person-knowing. Nurses should consider constructive realism as a philosophy to underpin their practice. This philosophy supports measurement as a primary method of inquiry in nursing research and clinical practice. Nurses can carefully select, and purposefully integrate, measurement tools with other methods of inquiry (such as qualitative research methods) to demonstrate the usefulness of nursing interventions and highlight nursing as a science. This article is protected by copyright. All rights reserved.

  20. Informal worker phenomenon in housing construction project

    NASA Astrophysics Data System (ADS)

    Wijayaningtyas, Maranatha; Sipan, Ibrahim; Lukiyanto, Kukuh

    2017-11-01

    The informal worker phenomenon on housing construction projects in Indonesia differs from other sectors, where workers generally request to become permanent employees. Substantively, the informal workers are disinclined to be bound as permanent employees, which departs from the general labor paradigm. Hence, the objective of this study is to find out how the labour selection process works, the factors that affect the workers' performance, and the wage system suited to achieving the target completion of a housing construction project. A qualitative method is used to uncover and understand the meaning behind the phenomena (noumena) of informal workers' actions and their influence on housing construction projects, which is called a phenomenological approach. Five informal workers and two project managers were selected as informants based on predetermined criteria, with in-depth interviews. The results showed that the informal workers were more satisfied with wages based on unit price while working on the housing construction project, because of the flexibility in working hours. In addition, the developer was also relieved because they only control the quality and the achievement of the project completion time, supported by the informal worker leader. Therefore, these findings are beneficial both to developers and to the government as policy maker for the success of the housing program in Indonesia.

  1. Construction of PAH-degrading mixed microbial consortia by induced selection in soil.

    PubMed

    Zafra, German; Absalón, Ángel E; Anducho-Reyes, Miguel Ángel; Fernandez, Francisco J; Cortés-Espinosa, Diana V

    2017-04-01

    Bioremediation of polycyclic aromatic hydrocarbon (PAH)-contaminated soils through biostimulation and bioaugmentation processes can be a strategy for the clean-up of oil spills and environmental accidents. In this work, an induced microbial selection method using PAH-polluted soils was successfully used to construct two microbial consortia exhibiting high degradation levels of low and high molecular weight PAHs. Six fungal and seven bacterial native strains were used to construct mixed consortia with the ability to tolerate high amounts of phenanthrene (Phe), pyrene (Pyr) and benzo(a)pyrene (BaP) and to utilize these compounds as a sole carbon source. In addition, we used two engineered PAH-degrading fungal strains producing heterologous ligninolytic enzymes. After a previous selection using microbial antagonism tests, the selection was performed in microcosm systems and monitored using PCR-DGGE, CO2 evolution and PAH quantitation. The resulting consortia (i.e., C1 and C2) were able to degrade up to 92% of Phe, 64% of Pyr and 65% of BaP out of 1000 mg kg-1 of a mixture of Phe, Pyr and BaP (1:1:1) after a two-week incubation. The results indicate that the constructed microbial consortia have high potential for soil bioremediation by bioaugmentation and biostimulation and may be effective for the treatment of sites polluted with PAHs due to their elevated tolerance to aromatic compounds and their capacity to utilize them as an energy source. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Improvements to the Kunkel mutagenesis protocol for constructing primary and secondary phage-display libraries.

    PubMed

    Huang, Renhua; Fang, Pete; Kay, Brian K

    2012-09-01

    Site-directed mutagenesis is routinely performed in protein engineering experiments. One method, termed Kunkel mutagenesis, is frequently used for constructing libraries of peptide or protein variants in M13 bacteriophage, followed by affinity selection of phage particles. To make this method more efficient, the following two modifications were introduced: culture was incubated at 25°C for phage replication, which yielded two- to sevenfold more single-stranded DNA template compared to growth at 37°C, and restriction endonuclease recognition sites were used to remove non-recombinants. With both of the improvements, we could construct primary libraries of high complexity and that were 99-100% recombinant. Finally, with a third modification to the standard protocol of Kunkel mutagenesis, two secondary (mutagenic) libraries of a fibronectin type III (FN3) monobody were constructed with DNA segments that were amplified by error-prone and asymmetric PCR. Two advantages of this modification are that it bypasses the lengthy steps of restriction enzyme digestion and ligation, and that the pool of phage clones, recovered after affinity selection, can be used directly to generate a secondary library. Screening one of the two mutagenic libraries yielded variants that bound two- to fourfold tighter to human Pak1 kinase than the starting clone. The protocols described in this study should accelerate the discovery of phage-displayed recombinant affinity reagents. Copyright © 2012 Elsevier Inc. All rights reserved.

  3. Selecting foils for identification lineups: matching suspects or descriptions?

    PubMed

    Tunnicliff, J L; Clark, S E

    2000-04-01

    Two experiments directly compare two methods of selecting foils for identification lineups. The suspect-matched method selects foils based on their match to the suspect, whereas the description-matched method selects foils based on their match to the witness's description of the perpetrator. Theoretical analyses and previous results predict an advantage for description-matched lineups both in terms of correctly identifying the perpetrator and minimizing false identification of innocent suspects. The advantage for description-matched lineups should be particularly pronounced if the foils selected in suspect-matched lineups are too similar to the suspect. In Experiment 1, the lineups were created by trained police officers, and in Experiment 2, the lineups were constructed by undergraduate college students. The results of both experiments showed higher suspect-to-foil similarity for suspect-matched lineups than for description-matched lineups. However, neither experiment showed a difference in correct or false identification rates. Both experiments did, however, show that there may be an advantage for suspect-matched lineups in terms of no-pick and rejection responses. From these results, the endorsement of one method over the other seems premature.

  4. Simulation of tunneling construction methods of the Cisumdawu toll road

    NASA Astrophysics Data System (ADS)

    Abduh, Muhamad; Sukardi, Sapto Nugroho; Ola, Muhammad Rusdian La; Ariesty, Anita; Wirahadikusumah, Reini D.

    2017-11-01

    Simulation can be used as a tool for planning and analysis of a construction method. Using simulation techniques, a contractor can optimally design the resources associated with a construction method and compare it with other methods based on several criteria, such as productivity, waste, and cost. This paper discusses the use of simulation of the Norwegian Method of Tunnelling (NMT) for a 472-meter tunneling work in the Cisumdawu Toll Road project. Primary and secondary data were collected to provide useful information for the simulation, as well as problems that may be faced by the contractor. The method was modelled using CYCLONE and then simulated using WebCYCLONE. The simulation could show the duration of the project from the duration model of each work task, which was based on literature review, machine productivity, and several assumptions. The results of the simulation could also show the total cost of the project, which was modeled based on construction and building unit-cost journals and the online websites of local and international suppliers. The analysis of the advantages and disadvantages of the method was conducted based on its productivity, wastes, and cost. The simulation concluded that the total cost of this operation is about Rp. 900,437,004,599 and the total duration of the tunneling operation is 653 days. The results of the simulation will be used for a recommendation to the contractor before the implementation of the already selected tunneling operation.
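
The cyclic structure of such a tunnelling model can be sketched as a small Monte Carlo simulation: each blast round executes the work tasks in sequence, and the tunnel advances a fixed amount per round until the 472 m length is reached. All task durations, the advance per round, and the uniform-duration assumption below are illustrative placeholders, not the CYCLONE model's actual inputs.

```python
import random

# Hypothetical per-round task durations in hours; the real model's numbers
# come from literature review and machine productivity data.
TASKS = {
    "drilling": (2.0, 3.0),
    "charging_and_blasting": (1.5, 2.5),
    "mucking": (3.0, 4.5),
    "rock_support": (2.0, 3.5),
}
ADVANCE_PER_ROUND_M = 4.0    # assumed advance per blast round
TUNNEL_LENGTH_M = 472.0

def simulate_tunnel(seed=0):
    """One Monte Carlo run of the sequential tunnelling work-task cycle."""
    rng = random.Random(seed)
    hours = 0.0
    advance = 0.0
    while advance < TUNNEL_LENGTH_M:
        for lo, hi in TASKS.values():      # tasks execute in sequence each round
            hours += rng.uniform(lo, hi)   # uniform duration model (assumption)
        advance += ADVANCE_PER_ROUND_M
    return hours / 24.0                    # total duration in days

# Averaging many runs gives the expected operation duration.
mean_days = sum(simulate_tunnel(seed=s) for s in range(100)) / 100
```

Repeating the run with alternative task-duration models is how such a simulation compares candidate construction methods before committing resources.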

  5. Construction of an adaptable European transnational ecological deprivation index: the French version

    PubMed Central

    Delpierre, Cyrille; Dejardin, Olivier; Grosclaude, Pascale; Launay, Ludivine; Guittet, Lydia; Lang, Thierry; Launoy, Guy

    2012-01-01

    Background: Studying social disparities in health implies the ability to measure them accurately, to compare them between different areas or countries and to follow trends over time. This study proposes a method for constructing a French European deprivation index, which will be replicable in several European countries and is related to an individual deprivation indicator constructed from a European survey specifically designed to study deprivation. Methods and Results: Using individual data from the European Union Statistics on Income and Living Conditions survey, goods/services indicated by individuals as being fundamental needs, the lack of which reflect deprivation, were selected. From this definition, which is specific to a cultural context, an individual deprivation indicator was constructed by selecting fundamental needs associated both with objective and subjective poverty. Next, the authors selected, among variables available both in the European Union Statistics on Income and Living Conditions survey and the French national census, those best reflecting individual experience of deprivation, using multivariate logistic regression. An ecological measure of deprivation was provided for all the smallest French geographical units. Preliminary validation showed a higher association between the French European Deprivation Index (EDI) score and both income and education than the Townsend index, partly ensuring its ability to measure individual socioeconomic status. Conclusion: This index, which is specific to a particular cultural and social policy context, could be replicated in 25 other European countries, thereby allowing European comparisons. EDI could also be reproducible over time. EDI could prove to be a relevant tool in evidence-based policy-making for measuring and reducing social disparities in health issues and even outside the medical domain. PMID:22544918
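
The final aggregation step of such an ecological index, combining standardized area-level census variables with weights, can be sketched as follows. The area names, variable values, and weights below are hypothetical; the actual EDI derives its variable set and weights from logistic regression against individual deprivation.

```python
from statistics import mean, pstdev

# Hypothetical area-level census variables; the real EDI selects variables
# and weights via logistic regression against individual deprivation.
areas = {
    "A": {"unemployment": 0.12, "no_car": 0.30, "overcrowding": 0.08},
    "B": {"unemployment": 0.05, "no_car": 0.10, "overcrowding": 0.02},
    "C": {"unemployment": 0.20, "no_car": 0.45, "overcrowding": 0.15},
}
weights = {"unemployment": 1.4, "no_car": 0.9, "overcrowding": 1.1}  # assumed

def deprivation_scores(areas, weights):
    """Standardize each variable across areas, then take a weighted sum."""
    stats = {v: (mean(a[v] for a in areas.values()),
                 pstdev(a[v] for a in areas.values())) for v in weights}
    return {name: sum(w * (vals[v] - stats[v][0]) / stats[v][1]
                      for v, w in weights.items())
            for name, vals in areas.items()}

scores = deprivation_scores(areas, weights)   # highest score = most deprived
```

Standardizing before weighting, as the Townsend index also does, keeps variables measured on different scales commensurable.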

  6. Supplier selection based on complex indicator of finished products quality

    NASA Astrophysics Data System (ADS)

    Chernikova, Anna; Golovkina, Svetlana; Kuzmina, Svetlana; Demenchenok, Tatiana

    2017-10-01

    In this article the authors consider possible directions for solving problems in selecting a supplier of raw materials for an industrial enterprise; likely difficulties are analyzed and solutions are suggested. Various methods for improving the efficiency of the supplier selection process are considered, based on an analysis of the process of selecting a paper-bag supplier for the needs of a construction company. The article presents the calculation of generalized indicators and of a complex indicator, which comprises single indicators formed into groups that reflect different aspects of quality.
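
The aggregation the abstract describes (single indicators grouped into generalized indicators, then combined into one complex indicator) can be sketched as below; the supplier names, indicator values, and group weights are all hypothetical.

```python
# Hypothetical scoring sketch: single quality indicators (scored 0-1) are
# grouped; each group is averaged into a generalized indicator, and the
# complex indicator is the weighted sum of the generalized indicators.
suppliers = {
    "S1": {"strength": {"tensile": 0.9, "burst": 0.8},
           "appearance": {"print": 0.7, "colour": 0.6},
           "delivery": {"on_time": 0.95}},
    "S2": {"strength": {"tensile": 0.7, "burst": 0.6},
           "appearance": {"print": 0.9, "colour": 0.9},
           "delivery": {"on_time": 0.80}},
}
group_weights = {"strength": 0.5, "appearance": 0.3, "delivery": 0.2}  # assumed

def complex_indicator(groups, weights):
    """Average each group, then take the weighted sum across groups."""
    generalized = {g: sum(v.values()) / len(v) for g, v in groups.items()}
    return sum(weights[g] * generalized[g] for g in weights)

best = max(suppliers, key=lambda s: complex_indicator(suppliers[s], group_weights))
```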

  7. Selection of a rigid internal fixation construct for stabilization at the craniovertebral junction in pediatric patients.

    PubMed

    Anderson, Richard C E; Ragel, Brian T; Mocco, J; Bohman, Leif-Erik; Brockmeyer, Douglas L

    2007-07-01

    Atlantoaxial and occipitocervical instability in children have traditionally been treated with posterior bone and wire fusion and external halo orthoses. Recently, successful outcomes have been achieved using rigid internal fixation, particularly C1-2 transarticular screws. The authors describe flow diagrams created to help clinicians determine which method of internal fixation to use in complex anatomical circumstances when bilateral transarticular screw placement is not possible. The records of children who underwent either atlantoaxial or occipitocervical fixation with rigid internal fixation over an 11-year period were retrospectively reviewed to define flow diagrams used to determine treatment protocols. Among the 95 patients identified who underwent atlantoaxial or occipitocervical fixation, the craniocervical anatomy in 25 patients (six atlantoaxial and 19 occipitocervical fixations [26%]) required alternative methods of internal fixation. Types of screw fixation included loop or rod constructs anchored by combinations of C1-2 transarticular screws (15 constructs), C-1 lateral mass screws (11), C-2 pars screws (24), C-2 translaminar screws (one), and subaxial lateral mass screws (six). The mean age of the patients (15 boys and 10 girls) was 9.8 years (range 1.3-17 years). All 22 patients with greater than 3-month follow-up duration achieved solid bone fusion and maintained stable constructs on radiographic studies. Clinical improvement was seen in all patients who had preoperative symptoms. Novel flow diagrams are suggested to help guide selection of rigid internal fixation constructs when performing pediatric C1-2 and occipitocervical stabilizations. Use of these flow diagrams has led to successful fusion in 25 pediatric patients with difficult anatomy requiring less common constructs.

  8. Superwetting nanowire membranes for selective absorption.

    PubMed

    Yuan, Jikang; Liu, Xiaogang; Akbulut, Ozge; Hu, Junqing; Suib, Steven L; Kong, Jing; Stellacci, Francesco

    2008-06-01

    The construction of nanoporous membranes is of great technological importance for various applications, including catalyst supports, filters for biomolecule purification, environmental remediation and seawater desalination. A major challenge is the scalable fabrication of membranes with the desirable combination of good thermal stability, high selectivity and excellent recyclability. Here we present a self-assembly method for constructing thermally stable, free-standing nanowire membranes that exhibit controlled wetting behaviour ranging from superhydrophilic to superhydrophobic. These membranes can selectively absorb oils up to 20 times the material's weight in preference to water, through a combination of superhydrophobicity and capillary action. Moreover, the nanowires that form the membrane structure can be re-suspended in solutions and subsequently re-form the original paper-like morphology over many cycles. Our results suggest an innovative material that should find practical applications in the removal of organics, particularly in the field of oil spill cleanup.

  9. Scoring Method of a Situational Judgment Test: Influence on Internal Consistency Reliability, Adverse Impact and Correlation with Personality?

    ERIC Educational Resources Information Center

    De Leng, W. E.; Stegers-Jager, K. M.; Husbands, A.; Dowell, J. S.; Born, M. Ph.; Themmen, A. P.

    2017-01-01

    Situational Judgment Tests (SJTs) are increasingly used for medical school selection. Scoring an SJT is more complicated than scoring a knowledge test, because there are no objectively correct answers. The scoring method of an SJT may influence the construct and concurrent validity and the adverse impact with respect to non-traditional students.…

  10. Development of a Decision Model for Selection of Appropriate Timely Delivery Techniques for Highway Projects

    DOT National Transportation Integrated Search

    2009-04-01

    "The primary umbrella method used by the Oregon Department of Transportation (ODOT) to ensure on-time performance in standard construction contracting is liquidated damages. The assessment value is usually a matter of some judgment. In practice...

  11. Site selection for managed aquifer recharge using fuzzy rules: integrating geographical information system (GIS) tools and multi-criteria decision making

    NASA Astrophysics Data System (ADS)

    Malekmohammadi, Bahram; Ramezani Mehrian, Majid; Jafari, Hamid Reza

    2012-11-01

    One of the most important water-resources management strategies for arid lands is managed aquifer recharge (MAR). In establishing a MAR scheme, site selection is the prime prerequisite that can be assisted by geographic information system (GIS) tools. One of the most important uncertainties in the site-selection process using GIS are the finite ranges or intervals resulting from data classification. In order to reduce these uncertainties, a novel method has been developed involving the integration of multi-criteria decision making (MCDM), GIS, and a fuzzy inference system (FIS). The Shemil-Ashkara plain in the Hormozgan Province of Iran was selected as the case study; slope, geology, groundwater depth, potential for runoff, land use, and groundwater electrical conductivity have been considered as site-selection factors. By defining fuzzy membership functions for the input layers and the output layer, and by constructing fuzzy rules, a FIS has been developed. Comparison of the results produced by the proposed method and the traditional simple additive weighted (SAW) method shows that the proposed method yields more precise results. In conclusion, fuzzy-set theory can be an effective method to overcome associated uncertainties in classification of geographic information data.
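
A minimal fuzzy inference step of the kind described, reduced to two of the six factors with invented membership functions and rule outputs, might look like this (the real FIS uses all six input layers and rules tuned to the Shemil-Ashkara plain):

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical membership functions for two of the six factors.
def slope_gentle(pct):            # fully suitable at 0% slope, unsuitable at 15%+
    return max(0.0, min(1.0, (15.0 - pct) / 15.0))

def depth_suitable(m):            # groundwater-depth sweet spot around 20 m
    return tri(m, 5.0, 20.0, 40.0)

def suitability(slope_pct, depth_m):
    """Zero-order Sugeno-style inference over two illustrative rules."""
    fire_good = min(slope_gentle(slope_pct), depth_suitable(depth_m))
    fire_bad = max(1.0 - slope_gentle(slope_pct), 1.0 - depth_suitable(depth_m))
    # Weighted average of the crisp rule outputs (1.0 suitable, 0.2 unsuitable).
    return (fire_good * 1.0 + fire_bad * 0.2) / (fire_good + fire_bad)
```

Because membership grades vary continuously, a cell near a class boundary gets an intermediate score rather than jumping between classes, which is exactly the classification uncertainty the paper aims to reduce.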

  12. An improved method for constructing and selectively silanizing double-barreled, neutral liquid-carrier, ion-selective microelectrodes

    PubMed Central

    Deveau, Jason S.T.; Grodzinski, Bernard

    2005-01-01

    We describe an improved, efficient and reliable method for the vapour-phase silanization of multi-barreled, ion-selective microelectrodes of which the silanized barrel(s) are to be filled with neutral liquid ion-exchanger (LIX). The technique employs a metal manifold to exclusively and simultaneously deliver dimethyldichlorosilane to only the ion-selective barrels of several multi-barreled microelectrodes. Compared to previously published methods the technique requires fewer procedural steps, less handling of individual microelectrodes, improved reproducibility of silanization of the selected microelectrode barrels and employs standard borosilicate tubing rather than the less-conventional theta-type glass. The electrodes remain stable for up to 3 weeks after the silanization procedure. The efficacy of a double-barreled electrode containing a proton ionophore in the ion-selective barrel is demonstrated in situ in the leaf apoplasm of pea (Pisum) and sunflower (Helianthus). Individual leaves were penetrated to depth of ~150 μm through the abaxial surface. Microelectrode readings remained stable after multiple impalements without the need for a stabilizing PVC matrix. PMID:16136222

  13. Method of Remotely Constructing a Room

    DOEpatents

    Michie, J. D.; De Hart, R. C.

    1971-10-05

    The testing of nuclear devices of high explosive yield has required that cavities of relatively large size be provided at considerable distances below the surface of the earth for the pre-detonation emplacement of the device. The construction of an essentially watertight chamber or room in the cavity is generally required for the actual emplacement of the device. A method is described of constructing such a room deep within the earth by personnel at the surface. A dual wall bladder of a watertight, pliable fabric material is lowered down a shaft into a selected position. The bladder is filled with a concrete grout while a heavy fluid having essentially the same density as the grout is maintained on both sides of the bladder, to facilitate complete deployment of the bladder by the grout to form a room of desired configuration. (10 claims)

  14. Method of remotely constructing a room

    DOEpatents

    Michie, J.D.; De Hart, R.C.

    1971-10-05

    The testing of nuclear devices of high explosive yield has required that cavities of relatively large size be provided at considerable distances below the surface of the earth for the pre-detonation emplacement of the device. The construction of an essentially watertight chamber or room in the cavity is generally required for the actual emplacement of the device. A method is described of constructing such a room deep within the earth by personnel at the surface. A dual wall bladder of a watertight, pliable fabric material is lowered down a shaft into a selected position. The bladder is filled with a concrete grout while a heavy fluid having essentially the same density as the grout is maintained on both sides of the bladder, to facilitate complete deployment of the bladder by the grout to form a room of desired configuration. (10 claims)

  15. A Review of International Space Station Habitable Element Equipment Offgassing Characteristics

    NASA Technical Reports Server (NTRS)

    Perry, Jay L.

    2010-01-01

    Crewed spacecraft trace contaminant control employs both passive and active methods to achieve acceptable cabin atmospheric quality. Passive methods include carefully selecting materials of construction, employing clean manufacturing practices, and minimizing systems and payload operational impacts to the cabin environment. Materials selection and manufacturing processes constitute the first level of equipment offgassing control. An element-level equipment offgassing test provides preflight verification that passive controls have been successful. Offgassing test results from multiple International Space Station (ISS) habitable elements and cargo vehicles are summarized and implications for active contamination control equipment design are discussed.

  16. Ray tracing method for simulation of laser beam interaction with random packings of powders

    NASA Astrophysics Data System (ADS)

    Kovalev, O. B.; Kovaleva, I. O.; Belyaev, V. V.

    2018-03-01

    Selective laser sintering is a rapid, free-form manufacturing technology in which a solid object is created by selectively fusing successive layers of powder with a laser. This study is motivated by the currently insufficient understanding of the processes and phenomena of selective laser melting of powders, whose time scales differ by orders of magnitude. To construct random packings of mono- and polydispersed solid spheres, a generation algorithm based on the discrete element method is used. A numerical ray-tracing method is proposed to simulate the interaction of laser radiation with a random bulk packing of spherical particles and to predict the optical properties of the granular layer, namely the extinction and absorption coefficients, depending on the optical properties of the powder material.
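
The geometric core of such a ray-tracing scheme is the ray-sphere intersection test, applied to every particle in the packing to find the first surface each ray strikes. The sketch below shows only that step, on a hypothetical two-particle packing; energy absorption, reflection, and the packing-generation algorithm are omitted.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Smallest positive t where origin + t*direction hits the sphere, else None."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * a * c          # discriminant of the quadratic in t
    if disc < 0:
        return None                     # ray misses the sphere
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if t > 0 else None

def first_hit(origin, direction, spheres):
    """Nearest particle hit by a ray cast into the packing."""
    hits = [(t, i) for i, (c, r) in enumerate(spheres)
            if (t := ray_sphere_hit(origin, direction, c, r)) is not None]
    return min(hits) if hits else None

# Ray fired straight down onto a two-particle column (hypothetical geometry).
spheres = [((0.0, 0.0, 0.0), 1.0), ((0.0, 0.0, -3.0), 1.0)]
hit = first_hit((0.0, 0.0, 5.0), (0.0, 0.0, -1.0), spheres)
```

Tracing many such rays, attenuating their energy at each hit, is what yields layer-level extinction and absorption estimates.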

  17. 49 CFR 268.21 - Down-selection of one or more Maglev projects for further study and selection of one project for...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... further study and selection of one project for final design, engineering, and construction funding. 268.21... and selection of one project for final design, engineering, and construction funding. (a) Upon... analyses necessary prior to initiation of construction. Final design and engineering work will also be...

  18. 49 CFR 268.21 - Down-selection of one or more Maglev projects for further study and selection of one project for...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... further study and selection of one project for final design, engineering, and construction funding. 268.21... and selection of one project for final design, engineering, and construction funding. (a) Upon... analyses necessary prior to initiation of construction. Final design and engineering work will also be...

  19. 49 CFR 268.21 - Down-selection of one or more Maglev projects for further study and selection of one project for...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... further study and selection of one project for final design, engineering, and construction funding. 268.21... and selection of one project for final design, engineering, and construction funding. (a) Upon... analyses necessary prior to initiation of construction. Final design and engineering work will also be...

  20. 49 CFR 268.21 - Down-selection of one or more Maglev projects for further study and selection of one project for...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... further study and selection of one project for final design, engineering, and construction funding. 268.21... and selection of one project for final design, engineering, and construction funding. (a) Upon... analyses necessary prior to initiation of construction. Final design and engineering work will also be...

  1. 49 CFR 268.21 - Down-selection of one or more Maglev projects for further study and selection of one project for...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... further study and selection of one project for final design, engineering, and construction funding. 268.21... and selection of one project for final design, engineering, and construction funding. (a) Upon... analyses necessary prior to initiation of construction. Final design and engineering work will also be...

  2. Adaptation and niche construction in human prehistory: a case study from the southern Scandinavian Late Glacial

    PubMed Central

    Riede, Felix

    2011-01-01

    The niche construction model postulates that human bio-social evolution is composed of three inheritance domains, genetic, cultural and ecological, linked by feedback selection. This paper argues that many kinds of archaeological data can serve as proxies for human niche construction processes, and presents a method for investigating specific niche construction hypotheses. To illustrate this method, the repeated emergence of specialized reindeer (Rangifer tarandus) hunting/herding economies during the Late Palaeolithic (ca 14.7–11.5 kyr BP) in southern Scandinavia is analysed from a niche construction/triple-inheritance perspective. This economic relationship resulted in the eventual domestication of Rangifer. The hypothesis of whether domestication was achieved as early as the Late Palaeolithic, and whether this required the use of domesticated dogs (Canis familiaris) as hunting, herding or transport aids, is tested via a comparative analysis using material culture-based phylogenies and ecological datasets in relation to demographic/genetic proxies. Only weak evidence for sustained niche construction behaviours by prehistoric hunter–gatherers in southern Scandinavia is found, but this study nonetheless provides interesting insights into the likely processes of dog and reindeer domestication, and into processes of adaptation in Late Glacial foragers. PMID:21320895

  3. Adaptation and niche construction in human prehistory: a case study from the southern Scandinavian Late Glacial.

    PubMed

    Riede, Felix

    2011-03-27

    The niche construction model postulates that human bio-social evolution is composed of three inheritance domains, genetic, cultural and ecological, linked by feedback selection. This paper argues that many kinds of archaeological data can serve as proxies for human niche construction processes, and presents a method for investigating specific niche construction hypotheses. To illustrate this method, the repeated emergence of specialized reindeer (Rangifer tarandus) hunting/herding economies during the Late Palaeolithic (ca 14.7-11.5 kyr BP) in southern Scandinavia is analysed from a niche construction/triple-inheritance perspective. This economic relationship resulted in the eventual domestication of Rangifer. The hypothesis of whether domestication was achieved as early as the Late Palaeolithic, and whether this required the use of domesticated dogs (Canis familiaris) as hunting, herding or transport aids, is tested via a comparative analysis using material culture-based phylogenies and ecological datasets in relation to demographic/genetic proxies. Only weak evidence for sustained niche construction behaviours by prehistoric hunter-gatherers in southern Scandinavia is found, but this study nonetheless provides interesting insights into the likely processes of dog and reindeer domestication, and into processes of adaptation in Late Glacial foragers.

  4. Encoded Library Synthesis Using Chemical Ligation and the Discovery of sEH Inhibitors from a 334-Million Member Library

    NASA Astrophysics Data System (ADS)

    Litovchick, Alexander; Dumelin, Christoph E.; Habeshian, Sevan; Gikunju, Diana; Guié, Marie-Aude; Centrella, Paolo; Zhang, Ying; Sigel, Eric A.; Cuozzo, John W.; Keefe, Anthony D.; Clark, Matthew A.

    2015-06-01

    A chemical ligation method for construction of DNA-encoded small-molecule libraries has been developed. Taking advantage of the ability of the Klenow fragment of DNA polymerase to accept templates with triazole linkages in place of phosphodiesters, we have designed a strategy for chemically ligating oligonucleotide tags using cycloaddition chemistry. We have utilized this strategy in the construction and selection of a small molecule library, and successfully identified inhibitors of the enzyme soluble epoxide hydrolase.

  5. [Construction and characterization of a selective membrane electrode for tenoxicam determination].

    PubMed

    Murăraşu, Andreea Elena; Mândrescu, Mariana; Spac, A F; Dorneanu, V

    2010-01-01

    This paper describes the construction and characterization of a selective membrane electrode which can be used for the determination of tenoxicam. The electroactive compound is a precipitate obtained in a 2 N hydrochloric acid solution containing tenoxicam, to which a solution of iodine is added. The membrane is made by mixing the electroactive compound with polyethylene, using tetrahydrofuran as solvent. The solution is evaporated to obtain a thick membrane, which is attached at one end of a PVC tube and fixed with the same polymeric solution. An internal Ag/AgCl reference electrode is inserted in this tube. The assembly is filled with an internal solution containing tenoxicam. The electrode was characterized (electrode slope, selectivity, optimal pH range, response time, lifetime) and the developed method was validated. The method showed good linearity in the range of 10(-6)-10(-1) M (correlation coefficient r = 0.9999). The detection limit (LD) was 7.347 x 10(-7) M and the quantification limit (LQ) was 1.017 x 10(-6) M. The precision (RSD = 1.79%) and the accuracy (mean recovery of 100.17%) were established. The experimental results demonstrated good sensitivity.
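
The slope and linearity figures in such a characterization come from a least-squares fit of measured potential against the logarithm of concentration. A sketch with invented, near-Nernstian calibration readings (not the paper's data):

```python
import math

# Hypothetical calibration: potential E (mV) versus log10 concentration, for
# an electrode with a near-Nernstian slope of about -59 mV per decade.
log_c = [-6, -5, -4, -3, -2, -1]
emf   = [310.5, 251.0, 192.2, 133.1, 74.0, 15.2]   # assumed readings

def linfit(x, y):
    """Least-squares slope, intercept and correlation coefficient r."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    syy = sum((yi - my) ** 2 for yi in y)
    slope = sxy / sxx
    intercept = my - slope * mx
    r = sxy / math.sqrt(sxx * syy)
    return slope, intercept, r

slope, intercept, r = linfit(log_c, emf)   # slope near -59 mV/decade, |r| near 1
```

An |r| of 0.9999 over five decades, as reported in the abstract, corresponds to residuals of well under one millivolt per point in a fit of this kind.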

  6. Research on the Purification Effect of Aquatic Plants Based on Grey Clustering Method

    NASA Astrophysics Data System (ADS)

    Gu, Sudan; Du, Fuhui

    2018-01-01

    This paper uses the grey clustering method to evaluate the water quality level at the inlet and outlet of the Ming Guan constructed wetland. The Ming Guan constructed wetland was established to improve the water quality of the Fuyang River by selecting suitable aquatic plants, in order to achieve a purification effect on the river water. TP, TN, NH3-N, DO, COD and the permanganate index (CODMn) are selected as clustering indicators. Water quality is divided into five grades according to the Surface Water Environmental Quality Standard (GB3838-2002), used as the evaluation standard. In order to select suitable wetland plants, the purification effect of six kinds of higher aquatic plants on the sewage of the Fuyang River was tested, and one plant was selected: Typha. The results show that the water quality of the section gradually changed from grade V to grade III. After the wetland had operated over a long cycle, Typha showed a good purification effect. In November, water quality was concentrated in grade V or worse, possibly caused by chemical decomposition of the aquatic plants; this aspect requires further research.
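
The grey clustering step can be sketched as follows: each grade has a triangular whitenization weight function per indicator, and a sample is assigned the grade whose clustering coefficient is largest. The grade centres, indicator weights, and samples below are illustrative placeholders, not the GB3838-2002 limit values.

```python
def tri(x, a, b, c):
    """Triangular whitenization weight function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Illustrative grade centres for two indicators (mg/L), grades I..V.
centres = {"NH3-N": [0.15, 0.5, 1.0, 1.5, 2.0],
           "TP":    [0.02, 0.1, 0.2, 0.3, 0.4]}
weights = {"NH3-N": 0.5, "TP": 0.5}    # equal indicator weights (assumption)

def grade(sample):
    """Assign the grade (1..5) with the largest grey clustering coefficient."""
    coeffs = []
    for g in range(5):
        total = 0.0
        for ind, cs in centres.items():
            lo = cs[g - 1] if g > 0 else 0.0
            hi = cs[g + 1] if g < 4 else 2.0 * cs[g]
            total += weights[ind] * tri(sample[ind], lo, cs[g], hi)
        coeffs.append(total)
    return coeffs.index(max(coeffs)) + 1

inlet = grade({"NH3-N": 2.0, "TP": 0.4})    # heavily polluted sample
outlet = grade({"NH3-N": 0.5, "TP": 0.1})   # sample after wetland treatment
```

Comparing the grades assigned at the inlet and outlet is how the clustering quantifies the purification effect.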

  7. In Silico Syndrome Prediction for Coronary Artery Disease in Traditional Chinese Medicine

    PubMed Central

    Lu, Peng; Chen, Jianxin; Zhao, Huihui; Gao, Yibo; Luo, Liangtao; Zuo, Xiaohan; Shi, Qi; Yang, Yiping; Yi, Jianqiang; Wang, Wei

    2012-01-01

    Coronary artery disease (CAD) is the leading cause of death in the world. The differentiation of syndrome (ZHENG) is the criterion of diagnosis and treatment in TCM; therefore, in silico syndrome prediction can improve the performance of treatment. In this paper, we present a Bayesian network framework to construct a high-confidence syndrome predictor based on an optimal symptom subset collected by Support Vector Machine (SVM) feature selection. Syndromes of CAD can be divided into asthenia and sthenia syndromes. According to the hierarchical characteristics of syndrome, we first label every case with one of three syndrome types (asthenia, sthenia, or both) to handle patients presenting with several syndromes. On the basis of these three syndrome classes, we apply SVM feature selection to obtain the optimal symptom subset and compare this subset with Markov blanket feature selection using ROC curves. Using this subset, six predictors of CAD syndromes are constructed with the Bayesian network technique. We also evaluate Naïve Bayes, C4.5, Logistic and Radial Basis Function (RBF) network classifiers for comparison with the Bayesian network. In conclusion, the Bayesian network method based on the optimal symptom subset offers a practical way to predict the six syndromes of CAD in TCM. PMID:22567030
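
As a stand-in for the paper's Bayesian network predictor, the sketch below shows the general shape of probabilistic syndrome prediction from a selected binary-symptom subset, using a simple Naïve Bayes classifier on invented toy data (the actual study builds full Bayesian networks over SVM-selected symptoms):

```python
import math
from collections import defaultdict

# Toy training data: binary symptom vectors labelled with a syndrome class.
# Symptom names and values are invented for illustration only.
train = [
    ({"palpitation": 1, "fatigue": 1, "chest_pain": 0}, "asthenia"),
    ({"palpitation": 1, "fatigue": 1, "chest_pain": 0}, "asthenia"),
    ({"palpitation": 0, "fatigue": 0, "chest_pain": 1}, "sthenia"),
    ({"palpitation": 0, "fatigue": 1, "chest_pain": 1}, "sthenia"),
]

def fit(train):
    """Count symptom occurrences per syndrome class."""
    counts = defaultdict(lambda: defaultdict(int))
    totals = defaultdict(int)
    for symptoms, syndrome in train:
        totals[syndrome] += 1
        for s, present in symptoms.items():
            counts[syndrome][s] += present
    return counts, totals

def predict(symptoms, counts, totals, alpha=1.0):
    """Naïve Bayes with Laplace smoothing over binary symptoms."""
    def log_posterior(y):
        lp = math.log(totals[y] / sum(totals.values()))
        for s, present in symptoms.items():
            p = (counts[y][s] + alpha) / (totals[y] + 2 * alpha)
            lp += math.log(p if present else 1.0 - p)
        return lp
    return max(totals, key=log_posterior)

counts, totals = fit(train)
label = predict({"palpitation": 1, "fatigue": 1, "chest_pain": 0}, counts, totals)
```

A Bayesian network generalizes this by modelling dependencies between symptoms instead of assuming conditional independence.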

  8. Response Surface Model Building Using Orthogonal Arrays for Computer Experiments

    NASA Technical Reports Server (NTRS)

    Unal, Resit; Braun, Robert D.; Moore, Arlene A.; Lepsch, Roger A.

    1997-01-01

    This study investigates response surface methods for computer experiments and discusses some of the approaches available. Orthogonal arrays constructed for computer experiments are studied and an example application to a technology selection and optimization study for a reusable launch vehicle is presented.

  9. Can One Portfolio Measure the Six ACGME General Competencies?

    ERIC Educational Resources Information Center

    Jarvis, Robert M.; O'Sullivan, Patricia S.; McClain, Tina; Clardy, James A.

    2004-01-01

    Objective: To determine that portfolios, useable by any program, can provide needed evidence of resident performance within the ACGME general competencies. Methods: Eighteen residents constructed portfolios with selected entries from thirteen psychiatric skills. Two raters assessed whether entries reflected resident performance within the general…

  10. ANFIS multi criteria decision making for overseas construction projects: a methodology

    NASA Astrophysics Data System (ADS)

    Utama, W. P.; Chan, A. P. C.; Zulherman; Zahoor, H.; Gao, R.; Jumas, D. Y.

    2018-02-01

    A critical issue when a company targets a foreign market is how to make better decisions on potential project selection. Since the attributes of information on overseas projects are often incomplete, imprecise and ill-defined, making decisions by relying on experience and intuition alone is risky. This paper aims to demonstrate a decision support method for deciding on overseas construction projects (OCPs). An Adaptive Neuro-Fuzzy Inference System (ANFIS), an amalgamation of neural networks and fuzzy theory, was used as a decision support tool for go/no-go decisions on OCPs. Root mean square error (RMSE) and the coefficient of correlation (R) were employed to identify the ANFIS configuration giving the optimum and most efficient result. The optimum result was obtained from an ANFIS network with two input membership functions, the Gaussian membership function (gaussmf), and the hybrid optimization method. The result shows that ANFIS may help the decision-making process for go/no-go decisions on OCPs.
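
The RMSE and R criteria used to pick the best ANFIS configuration are straightforward to compute; a sketch with hypothetical go/no-go targets and two candidate configurations' outputs:

```python
import math

# Hypothetical go/no-go targets (1 = go) and two candidate models' scores.
actual = [1.0, 0.0, 1.0, 1.0, 0.0]
candidates = {"A": [0.9, 0.2, 0.8, 0.7, 0.1],
              "B": [0.6, 0.5, 0.4, 0.6, 0.5]}

def rmse(y, yhat):
    """Root mean square error between targets and predictions."""
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(y, yhat)) / len(y))

def corr(y, yhat):
    """Pearson correlation coefficient R between targets and predictions."""
    my, mp = sum(y) / len(y), sum(yhat) / len(yhat)
    cov = sum((a - my) * (p - mp) for a, p in zip(y, yhat))
    sy = math.sqrt(sum((a - my) ** 2 for a in y))
    sp = math.sqrt(sum((p - mp) ** 2 for p in yhat))
    return cov / (sy * sp)

# The preferred configuration minimizes RMSE (and should maximize R).
best = min(candidates, key=lambda k: rmse(actual, candidates[k]))
```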

  11. TANK OPERATIONS CONTRACT CONSTRUCTION MANAGEMENT METHODOLOGY UTILIZING THE AGENCY METHOD OF CONSTRUCTION MANAGEMENT TO SAFELY AND EFFECTIVELY COMPLETE NUCLEAR CONSTRUCTION WORK

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    LESO KF; HAMILTON HM; FARNER M

    Washington River Protection Solutions, LLC (WRPS) has faced significant project management challenges in managing Davis-Bacon construction work that meets contractually required small business goals. The unique challenge is to provide contracting opportunities to multiple small business construction subcontractors while performing high hazard work in a safe and productive manner. Prior to the Washington River Protection Solutions, LLC contract, construction work at the Hanford Tank Farms was contracted to large companies, while current Department of Energy (DOE) contracts typically emphasize small business awards. As an integral part of nuclear project management at Hanford Tank Farms, construction involves removal of old equipment and structures and installation of new infrastructure to support waste retrieval and waste feed delivery to the Waste Treatment Plant. Utilizing the optimum construction approach ensures that the contractors responsible for this work are successful in meeting safety, quality, cost and schedule objectives while working in a very hazardous environment. This paper describes the successful transition from a traditional project delivery method that utilized a large business general contractor and subcontractors to a new project construction management model that is more oriented to small businesses. Construction has selected the Agency Construction Management Method. This method was implemented in the first quarter of Fiscal Year (FY) 2009, with construction management performed substantially by home office resources from the URS Northwest Office in Richland, Washington. The Agency Method has allowed WRPS to provide proven construction managers and field leads to mentor and direct small business contractors, thus providing expertise and assurance of a successful project. Construction execution contracts are subcontracted directly by WRPS to small or disadvantaged contractors that are mentored and supported by URS personnel. Each small contractor is mentored and supported utilizing the principles of the Construction Industry Institute (CII) Partnering process. Some of the key mentoring and partnering areas explored in this paper are internal and external safety professional support, subcontractor safety teams and their interface with project and site safety teams, quality assurance program support to facilitate compliance with NQA-1, construction team roles and responsibilities, work definition for successful fixed-price contracts, scheduling and interface with project schedules, and cost projection/accruals. The practical application of the CII Partnering principles, together with the construction management expertise of URS, has led to a highly successful construction model that also meets small business contracting goals.

  12. An improved principal component analysis based region matching method for fringe direction estimation

    NASA Astrophysics Data System (ADS)

    He, A.; Quan, C.

    2018-04-01

    The principal component analysis (PCA) and region matching combined method is effective for fringe direction estimation. However, its mask construction algorithm for region matching fails in some circumstances, and its algorithm for conversion of orientation to direction in mask areas is computationally heavy and unoptimized. We propose an improved PCA-based region matching method for fringe direction estimation, which includes an improved and robust mask construction scheme and a fast, optimized orientation-to-direction conversion algorithm for the mask areas. Along with the estimated fringe direction map, a fringe pattern filtered by automatic selective reconstruction modification and enhanced fast empirical mode decomposition (ASRm-EFEMD) is used for the Hilbert spiral transform (HST) to demodulate the phase. Subsequently, the windowed Fourier ridge (WFR) method is used to refine the phase. The robustness and effectiveness of the proposed method are demonstrated on both simulated and experimental fringe patterns.
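    The heart of any PCA-based fringe orientation estimator is taking the principal eigenvector of the local gradient covariance. A minimal sketch of that one building block, assuming a plain list-of-lists intensity window (the authors' full mask-construction and orientation-to-direction conversion are not reproduced here, and the function name is illustrative):

    ```python
    import math

    def pca_orientation(window):
        """Estimate the dominant gradient orientation in a small intensity
        window via PCA of the local gradient vectors -- a common building
        block of PCA-based direction estimators, not the authors' method."""
        rows, cols = len(window), len(window[0])
        gxs, gys = [], []
        for i in range(1, rows - 1):
            for j in range(1, cols - 1):
                # central-difference gradients
                gxs.append((window[i][j + 1] - window[i][j - 1]) / 2.0)
                gys.append((window[i + 1][j] - window[i - 1][j]) / 2.0)
        # 2x2 covariance of the gradient field
        cxx = sum(g * g for g in gxs)
        cyy = sum(g * g for g in gys)
        cxy = sum(a * b for a, b in zip(gxs, gys))
        # orientation of the principal eigenvector of [[cxx, cxy], [cxy, cyy]]
        return 0.5 * math.atan2(2.0 * cxy, cxx - cyy)

    # vertical fringes: intensity varies along x only, so gradients point along x
    window = [[math.cos(0.8 * j) for j in range(9)] for i in range(9)]
    angle = pca_orientation(window)
    ```

    For purely vertical fringes the gradient covariance is dominated by the x-component, so the returned angle is near zero; the fringes themselves run perpendicular to this gradient orientation.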

  13. Linear reduction methods for tag SNP selection.

    PubMed

    He, Jingwu; Zelikovsky, Alex

    2004-01-01

    It is widely hoped that constructing a complete human haplotype map will help to associate complex diseases with certain SNPs. Unfortunately, the number of SNPs is huge, and it is very costly to sequence many individuals. Therefore, it is desirable to reduce the number of SNPs that should be sequenced to a considerably smaller number of informative representatives, so-called tag SNPs. In this paper, we propose a new linear algebra-based method for selecting and using tag SNPs. Our method is purely combinatorial and can be combined with linkage disequilibrium (LD) and block-based methods. We measure the quality of our tag SNP selection algorithm by comparing actual SNPs with SNPs linearly predicted from linearly chosen tag SNPs. We obtain extremely good compression and prediction rates. For example, for long haplotypes (>25,000 SNPs), knowing only 0.4% of all SNPs we predict the entire unknown haplotype with only 2% error, while the prediction method is based on a 10% sample of the population.
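    The linear-algebra idea can be illustrated in a few lines: treat haplotypes as rows of a 0/1 matrix and take a maximal linearly independent set of SNP columns as tags, so every remaining SNP is a linear combination of the tags. A hedged toy sketch, not the paper's algorithm; the haplotype data are invented:

    ```python
    def pivot_columns(matrix, tol=1e-9):
        """Return indices of a maximal linearly independent set of columns
        (the candidate tag SNPs) via Gaussian elimination with partial
        pivoting. Illustrative sketch of the linear-algebra idea only."""
        m = [row[:] for row in matrix]       # work on a copy
        n_rows, n_cols = len(m), len(m[0])
        pivots, r = [], 0                    # r = current pivot row
        for c in range(n_cols):
            # largest entry in column c at or below row r
            p = max(range(r, n_rows), key=lambda i: abs(m[i][c]), default=None)
            if p is None or abs(m[p][c]) < tol:
                continue                     # column depends on earlier ones
            m[r], m[p] = m[p], m[r]
            for i in range(n_rows):
                if i != r and abs(m[i][c]) > tol:
                    f = m[i][c] / m[r][c]
                    m[i] = [a - f * b for a, b in zip(m[i], m[r])]
            pivots.append(c)
            r += 1
            if r == n_rows:
                break
        return pivots

    # toy haplotypes (rows) over 4 SNPs (columns): SNP 2 copies SNP 0 and
    # SNP 3 equals SNP 0 + SNP 1, so two tag SNPs suffice
    haplotypes = [
        [1, 0, 1, 1],
        [0, 1, 0, 1],
        [1, 0, 1, 1],
        [0, 0, 0, 0],
    ]
    tags = pivot_columns(haplotypes)
    ```

    Here the two pivot columns are the tags; each remaining SNP can then be predicted as a linear combination of them.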

  14. An unsupervised technique for optimal feature selection in attribute profiles for spectral-spatial classification of hyperspectral images

    NASA Astrophysics Data System (ADS)

    Bhardwaj, Kaushal; Patra, Swarnajyoti

    2018-04-01

    Inclusion of spatial information along with spectral features plays a significant role in the classification of remote sensing images. Attribute profiles have already proved their ability to represent spatial information. In order to incorporate proper spatial information, multiple attributes are required, and for each attribute large profiles need to be constructed by varying the filter parameter values within a wide range. Thus, the constructed profiles that represent the spectral-spatial information of a hyperspectral image have huge dimension, which leads to the Hughes phenomenon and increases the computational burden. To mitigate these problems, this work presents an unsupervised feature selection technique that selects, from the constructed high-dimensional multi-attribute profile, a subset of filtered images that is sufficiently informative to discriminate well among classes. To this end, the proposed technique exploits genetic algorithms (GAs). The fitness function of the GA is defined in an unsupervised way with the help of mutual information. The effectiveness of the proposed technique is assessed using a one-against-all support vector machine classifier. The experiments conducted on three hyperspectral data sets show the robustness of the proposed method in terms of computation time and classification accuracy.
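    A GA for unsupervised subset selection can be sketched compactly. Here the fitness rewards low redundancy via mean absolute pairwise Pearson correlation, a simple stand-in for the paper's mutual-information criterion; all names and data are illustrative:

    ```python
    import random

    def pearson(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        sx = sum((a - mx) ** 2 for a in x) ** 0.5
        sy = sum((b - my) ** 2 for b in y) ** 0.5
        if sx == 0 or sy == 0:
            return 0.0
        return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)

    def fitness(subset, features):
        """Higher fitness = less redundant subset (stand-in criterion)."""
        pairs = [(i, j) for i in subset for j in subset if i < j]
        return -sum(abs(pearson(features[i], features[j]))
                    for i, j in pairs) / len(pairs)

    def ga_select(features, k, pop_size=20, generations=40, seed=1):
        rng = random.Random(seed)
        n = len(features)
        pop = [tuple(sorted(rng.sample(range(n), k))) for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=lambda s: fitness(s, features), reverse=True)
            survivors = pop[: pop_size // 2]          # truncation selection
            children = []
            for _ in range(pop_size - len(survivors)):
                a, b = rng.sample(survivors, 2)
                pool = list(set(a) | set(b))          # crossover pool
                child = set(rng.sample(pool, min(k, len(pool))))
                while len(child) < k:                 # mutation fills the rest
                    child.add(rng.randrange(n))
                children.append(tuple(sorted(child)))
            pop = survivors + children
        return max(pop, key=lambda s: fitness(s, features))

    # three redundant copies of one signal plus two nearly independent ones
    f0 = [1, 2, 3, 4, 5, 6, 7, 8]
    features = [f0, [2 * v for v in f0], [v + 1 for v in f0],
                [5, 1, 4, 2, 8, 3, 7, 6], [3, 8, 1, 9, 2, 7, 4, 6]]
    best = ga_select(features, k=2)
    ```

    On this toy data the least-redundant pair couples the last (near-independent) feature with one member of the redundant family, and the GA converges to it quickly.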

  15. Determination of selected neurotoxic insecticides in small amounts of animal tissue utilizing a newly constructed mini-extractor.

    PubMed

    Seifertová, Marta; Čechová, Eliška; Llansola, Marta; Felipo, Vicente; Vykoukalová, Martina; Kočan, Anton

    2017-10-01

    We developed a simple analytical method for the simultaneous determination of representatives of various groups of neurotoxic insecticides (carbaryl, chlorpyrifos, cypermethrin, α-endosulfan, β-endosulfan and their metabolite endosulfan sulfate) in limited amounts of animal tissues containing different amounts of lipids. Selected tissues (rodent fat, liver, and brain) were extracted in a special in-house-designed mini-extractor constructed on the basis of the Soxhlet and Twisselmann extractors. A dried tissue sample placed in a small cartridge was extracted, while the nascent extract was simultaneously filtered through a layer of sodium sulfate. The extraction was followed by combined clean-up, including gel permeation chromatography (in the case of high lipid content), ultrasonication, and solid-phase extraction chromatography using C18 silica and aluminum oxide. Gas chromatography coupled with high-resolution mass spectrometry was used for analyte separation, detection, and quantification. Average recoveries for individual insecticides ranged from 82 to 111%. Expanded measurement uncertainties were generally lower than 35%. The developed method was successfully applied to rat tissue samples obtained from an animal model of insecticide exposure during brain development. The method may also be applied to the analytical treatment of small amounts of various types of animal and human tissue samples. A significant advantage of this method is high sample throughput due to the simultaneous treatment of many samples. Graphical abstract: Optimized workflow for the determination of selected insecticides in small amounts of animal tissue, including the newly developed mini-extractor.

  16. Methodology for developing and evaluating the PROMIS smoking item banks.

    PubMed

    Hansen, Mark; Cai, Li; Stucky, Brian D; Tucker, Joan S; Shadel, William G; Edelen, Maria Orlando

    2014-09-01

    This article describes the procedures used in the PROMIS Smoking Initiative for the development and evaluation of item banks, short forms (SFs), and computerized adaptive tests (CATs) for the assessment of six constructs related to cigarette smoking: nicotine dependence, coping expectancies, emotional and sensory expectancies, health expectancies, psychosocial expectancies, and social motivations for smoking. Analyses were conducted using response data from a large national sample of smokers. Items related to each construct were subjected to extensive item factor analyses and evaluation of differential item functioning (DIF). Final item banks were calibrated, and SF assessments were developed for each construct. The performance of the SFs and the potential use of the item banks for CAT administration were examined through simulation studies. Item selection based on dimensionality assessment and DIF analyses produced item banks that were essentially unidimensional in structure and free of bias. Simulation studies demonstrated that the constructs could be accurately measured with a relatively small number of carefully selected items, either through fixed SFs or CAT-based assessment. Illustrative results are presented, and subsequent articles provide detailed discussion of each item bank in turn. The development of the PROMIS smoking item banks provides researchers with new tools for measuring smoking-related constructs. The use of the calibrated item banks and suggested SF assessments will enhance the quality of score estimates, thus advancing smoking research. Moreover, the methods used in the current study, including innovative approaches to item selection and SF construction, may have general relevance to item bank development and evaluation.
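    The CAT side of such a system typically administers, at each step, the unused item with maximum Fisher information at the current ability estimate. A generic two-parameter-logistic (2PL) sketch with made-up item parameters, not the PROMIS calibrations:

    ```python
    import math

    def p_2pl(theta, a, b):
        """Endorsement probability under the two-parameter logistic model."""
        return 1.0 / (1.0 + math.exp(-a * (theta - b)))

    def item_information(theta, a, b):
        """Fisher information of a 2PL item at ability level theta."""
        p = p_2pl(theta, a, b)
        return a * a * p * (1.0 - p)

    def next_item(theta, bank, administered):
        """Maximum-information CAT selection: the unused item that is most
        informative at the current ability estimate."""
        candidates = [i for i in range(len(bank)) if i not in administered]
        return max(candidates, key=lambda i: item_information(theta, *bank[i]))

    # item bank as (discrimination a, difficulty b) pairs -- invented values
    bank = [(1.2, -2.0), (0.8, 0.0), (2.0, 0.1), (1.5, 2.5)]
    choice = next_item(theta=0.0, bank=bank, administered=set())
    ```

    At an ability estimate of 0, the highly discriminating item with difficulty near 0 dominates, which is why short CATs can match much longer fixed forms in precision.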

  17. Molecule nanoweaver

    DOEpatents

    Gerald, Rex E., II [Brookfield, IL]; Klingler, Robert J. [Glenview, IL]; Rathke, Jerome W. [Homer Glen, IL]; Diaz, Rocio [Chicago, IL]; Vukovic, Lela [Westchester, IL]

    2009-03-10

    A method, apparatus, and system for constructing uniform macroscopic films with tailored geometric assemblies of molecules on the nanometer scale. The method, apparatus, and system include providing starting molecules of selected character and applying one or more force fields to the molecules to cause them to order and condense, with NMR spectra and images used to monitor progress in creating the desired geometric assembly and functionality of the molecules that comprise the films.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oshima, Masumi; Kin, Tadahiro; Kimura, Atsushi

    Multi-step cascades from the ^62Ni(n_cold, γ)^63Ni reaction were studied via a γ-ray spectroscopy method. With a γ-ray detector array, multiple γ-ray coincidence events were accumulated. By selecting full cascade events from the capture state to the ground state, we have developed a new computer-based level construction method, which is applied to excited-level assignment in ^63Ni.

  19. Simulation Experiment on Landing Site Selection Using a Simple Geometric Approach

    NASA Astrophysics Data System (ADS)

    Zhao, W.; Tong, X.; Xie, H.; Jin, Y.; Liu, S.; Wu, D.; Liu, X.; Guo, L.; Zhou, Q.

    2017-07-01

    Safe landing is an important part of a planetary exploration mission. Even fine-scale terrain hazards (such as rocks, small craters, and steep slopes, which cannot be accurately detected from orbital reconnaissance) can pose a serious risk to a planetary lander or rover and the scientific instruments on board. In this paper, a simple geometric approach to planetary landing hazard detection and safe landing site selection is proposed. To achieve a full implementation of this algorithm, two easy-to-compute metrics are presented for extracting terrain slope and roughness information. Unlike conventional methods, which must perform robust plane fitting and elevation interpolation for DEM generation, in this work hazards are identified by processing the LiDAR point cloud directly. For safe landing site selection, a Generalized Voronoi Diagram is constructed. Based on the idea of the maximum empty circle, the safest landing site can be determined. In this algorithm, hazards are treated as general polygons, without special simplification (e.g., regarding hazards as discrete circles or ellipses), so processing hazards with this method conforms more closely to real planetary exploration scenarios. To validate the approach, a simulated planetary terrain model was constructed from volcanic ash with rocks in an indoor environment. A commercial laser scanner mounted on a rail was used to scan the terrain surface at different hanging positions. The results demonstrate fair hazard detection capability and reasonable site selection compared with a conventional method, with less computational time and memory usage. Hence, it is a feasible candidate approach for future precision landing site selection on planetary surfaces.
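    The maximum-empty-circle criterion behind the Generalized Voronoi Diagram step can be approximated with a brute-force grid search: pick the candidate point whose distance to the nearest hazard is largest. A dependency-free sketch with invented hazard coordinates (the paper's GVD construction scales far better on real point clouds):

    ```python
    import math

    def safest_site(hazards, x_range, y_range, step=0.5):
        """Grid-search stand-in for the maximum-empty-circle idea: the
        safest landing site is the candidate point whose distance to the
        nearest hazard is largest."""
        best_point, best_clearance = None, -1.0
        x = x_range[0]
        while x <= x_range[1]:
            y = y_range[0]
            while y <= y_range[1]:
                clearance = min(math.hypot(x - hx, y - hy)
                                for hx, hy in hazards)
                if clearance > best_clearance:
                    best_point, best_clearance = (x, y), clearance
                y += step
            x += step
        return best_point, best_clearance

    # hazards (rocks, small craters) at the corners of a 10 x 10 m patch
    hazards = [(0, 0), (10, 0), (0, 10), (10, 10)]
    site, clearance = safest_site(hazards, (0, 10), (0, 10))
    ```

    With hazards at the four corners, the center of the patch maximizes the clearance radius, exactly the point a maximum-empty-circle construction over the Voronoi vertices would return.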

  20. Selection Input Output by Restriction Using DEA Models Based on a Fuzzy Delphi Approach and Expert Information

    NASA Astrophysics Data System (ADS)

    Arsad, Roslah; Nasir Abdullah, Mohammad; Alias, Suriana; Isa, Zaidi

    2017-09-01

    Stock evaluation has always been an interesting problem for investors. In this paper, a comparison of the efficiency of stocks of companies listed on Bursa Malaysia was made through the application of the Data Envelopment Analysis (DEA) estimation method. One of the interesting research subjects in DEA is the selection of appropriate input and output parameters. In this study, DEA was used to measure the efficiency of stocks of listed companies on Bursa Malaysia in terms of financial ratios in order to evaluate stock performance. Based on previous studies and the Fuzzy Delphi Method (FDM), the most important financial ratios were selected. The results indicated that return on equity, return on assets, net profit margin, operating profit margin, earnings per share, price to earnings and debt to equity were the most important ratios. Using expert information, all the parameters were classified as inputs or outputs. The main objectives were to identify the most critical financial ratios, classify them based on expert information, and compute the relative efficiency scores of stocks as well as rank them completely within the construction and materials industry. The analysis employed Alirezaee and Afsharian's model, in which the originality of the Charnes, Cooper and Rhodes (CCR) model with the assumption of Constant Returns to Scale (CRS) still holds. This method of ranking the relative efficiency of decision-making units (DMUs) is value-added by the Balance Index. The data are for the year 2015, and the research population comprises the companies listed on the stock market in the construction and materials industry (63 companies). According to the ranking, the proposed model can completely rank all 63 companies using the selected financial ratios.
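    For a single input and a single output, the CCR model with constant returns to scale reduces to normalizing each DMU's output/input ratio by the best observed ratio, which is enough to show how DEA efficiency scores and rankings arise. The general model solves a linear program per DMU; the figures below are invented, not the paper's 63-company data:

    ```python
    def ccr_single_ratio(inputs, outputs):
        """CCR (constant-returns-to-scale) efficiency in the special case of
        one input and one output: each DMU's output/input ratio divided by
        the best observed ratio. Score 1.0 means the DMU lies on the
        efficient frontier."""
        ratios = [o / i for i, o in zip(inputs, outputs)]
        best = max(ratios)
        return [r / best for r in ratios]

    # toy "companies": input = total assets, output = net profit (invented)
    inputs = [100.0, 200.0, 120.0]
    outputs = [50.0, 50.0, 60.0]
    scores = ccr_single_ratio(inputs, outputs)
    ```

    DMUs 0 and 2 share the best output/input ratio and score 1.0 (efficient); DMU 1 scores 0.5, and sorting the scores yields the complete ranking the abstract describes.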

  1. Selection of a new Mycobacterium tuberculosis H37Rv aptamer and its application in the construction of a SWCNT/aptamer/Au-IDE MSPQC H37Rv sensor.

    PubMed

    Zhang, XiaoQing; Feng, Ye; Yao, QiongQiong; He, Fengjiao

    2017-12-15

    A rapid and accurate detection method for Mycobacterium tuberculosis (M. tuberculosis) is essential for effectively treating tuberculosis. However, current detection methods cannot meet these clinical requirements because they are slow or of low specificity. Consequently, a new highly specific ssDNA aptamer against the M. tuberculosis reference strain H37Rv was selected using the whole-cell systematic evolution of ligands by exponential enrichment technique. The selected aptamer was used to construct a fast and highly specific H37Rv sensor. The probe was produced by immobilizing the thiol-modified aptamer on an Au interdigital electrode (Au-IDE) of a multichannel series piezoelectric quartz crystal (MSPQC) through Au-S bonding, and then single-walled carbon nanotubes (SWCNTs) were bound to the aptamer by π-π stacking. SWCNTs were used as a signal indicator because of their considerable difference in conductivity compared with H37Rv. When H37Rv is present, it replaces the SWCNTs because it binds to the aptamer much more strongly than SWCNTs do. The replacement of SWCNTs by H37Rv results in a large change in the electrical properties, and this change is detected by the MSPQC. The proposed sensor is highly selective and can distinguish H37Rv from Mycobacterium smegmatis (M. smegmatis) and the Bacillus Calmette-Guérin vaccine (BCG). The detection time was 70 min and the detection limit was 100 cfu/mL. Compared with conventional methods, this new SWCNT/aptamer/Au-IDE MSPQC H37Rv sensor is specific, rapid, and sensitive, and it holds great potential for the early detection of H37Rv in clinical diagnosis.

  2. TANK OPERATIONS CONTRACT CONSTRUCTION MANAGEMENT METHODOLOGY UTILIZING THE AGENCY METHOD OF CONSTRUCTION MANAGEMENT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    LESKO KF; BERRIOCHOA MV

    2010-02-26

    Washington River Protection Solutions, LLC (WRPS) has faced significant project management challenges in managing Davis-Bacon construction work that meets contractually required small business goals. The unique challenge is to provide contracting opportunities to multiple small business construction subcontractors while performing high hazard work in a safe and productive manner. Prior to the WRPS contract, construction work at the Hanford Tank Farms was contracted to large companies, while current Department of Energy (DOE) contracts typically emphasize small business awards. As an integral part of Nuclear Project Management at Hanford Tank Farms, construction involves removal of old equipment and structures and installation of new infrastructure to support waste retrieval and waste feed delivery to the Waste Treatment Plant. Utilizing the optimum construction approach ensures that the contractors responsible for this work are successful in meeting safety, quality, cost and schedule objectives while working in a very hazardous environment. This paper describes the successful transition from a traditional project delivery method that utilized a large business general contractor and subcontractors to a new project construction management model that is more oriented to small businesses. Construction has selected the Agency Construction Management Method (John E. Schaufelberger and Len Holm, "Management of Construction Projects, A Constructor's Perspective", University of Washington, Prentice Hall, 2002). This method was implemented in the first quarter of Fiscal Year 2009 (FY2009), with construction management performed substantially by home office resources from the URS Northwest Office in Richland, Washington. The Agency Method has allowed WRPS to provide proven Construction Managers and Field Leads to mentor and direct small business contractors, thus providing expertise and assurance of a successful project.
Construction execution contracts are subcontracted directly by WRPS to small or disadvantaged contractors that are mentored and supported by URS personnel. Each small contractor is mentored and supported utilizing the principles of the Construction Industry Institute (CII) Partnering process. Some of the key mentoring and partnering areas explored in this paper are: internal and external safety professional support; subcontractor safety teams and their interface with project and site safety teams; quality assurance program support to facilitate compliance with NQA-1; construction team roles and responsibilities; work definition for successful fixed-price contracts; scheduling and interface with project schedules; and cost projections/accruals. The practical application of the CII Partnering principles, combined with the construction management expertise of URS, has led to a highly successful construction model that also meets small business contracting goals.

  3. Why developmental niche construction is not selective niche construction: and why it matters.

    PubMed

    Stotz, Karola

    2017-10-06

    In the last decade, niche construction has been heralded as the neglected process in evolution. But niche construction is just one way in which the organism's interaction with and construction of the environment can have potential evolutionary significance. The constructed environment does not just select; it also produces new variation. Nearly three decades ago, and in parallel with Odling-Smee's article 'Niche-constructing phenotypes', West and King introduced the 'ontogenetic niche' to give the phenomena of exogenetic inheritance a formal name. Since then, a range of fields in the life sciences and medicine has amassed evidence that parents influence their offspring by means other than DNA (parental effects), and proposed mechanisms for how heritable variation can be environmentally induced and developmentally regulated. The concept of 'developmental niche construction' (DNC) elucidates how a diverse range of mechanisms contributes to the transgenerational transfer of developmental resources. My central claim is that whereas the selective niche of niche construction theory is primarily used to explain the active role of the organism in its selective environment, DNC is meant to indicate the active role of the organism in its developmental environment. The paper highlights the differences between the construction of the selective and the developmental niche, and explores the overall significance of DNC for evolutionary theory.

  4. Development of a decision model for selection of appropriate timely delivery techniques for highway projects : final report, April 2009.

    DOT National Transportation Integrated Search

    2009-04-01

    The primary umbrella method used by the Oregon Department of Transportation (ODOT) to ensure on-time performance in standard construction contracting is liquidated damages. The assessment value is usually a matter of some judgment. In practice,...

  5. Correlation of rapid hydrometer analysis for select material to existing procedure LDH-TR-407-66 : final report.

    DOT National Transportation Integrated Search

    1968-05-01

    Conditions arise during construction of bases with Portland cement stabilized soils which require close programming of work. Therefore, time is of significant importance. That is the objective of this report: to evaluate a method by which considera...

  6. Approximate Genealogies Under Genetic Hitchhiking

    PubMed Central

    Pfaffelhuber, P.; Haubold, B.; Wakolbinger, A.

    2006-01-01

    The rapid fixation of an advantageous allele leads to a reduction in linked neutral variation around the target of selection. The genealogy at a neutral locus in such a selective sweep can be simulated by first generating a random path of the advantageous allele's frequency and then a structured coalescent in this background. Usually the frequency path is approximated by a logistic growth curve. We discuss an alternative method that approximates the genealogy by a random binary splitting tree, a so-called Yule tree, which does not require first constructing a frequency path. Compared to the coalescent in a logistic background, this method gives a slightly better approximation for identity by descent during the selective phase and a much better approximation for the number of lineages that stem from the founder of the selective sweep. In applications such as the approximation of the distribution of Tajima's D, the two approximation methods perform equally well. For relevant parameter ranges, the Yule approximation is faster. PMID:17182733
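    The random binary splitting (Yule) tree at the core of the approximation is easy to simulate: start from one lineage and repeatedly split a uniformly chosen extant lineage. A bare-bones sketch; the paper additionally dresses this tree with sweep-specific rates, which are omitted here:

    ```python
    import random

    def yule_tree(n_leaves, seed=7):
        """Grow a random binary splitting (Yule) tree: starting from a
        single lineage, repeatedly pick a uniformly random extant lineage
        and split it in two until n_leaves lineages exist."""
        rng = random.Random(seed)
        next_id = 0
        # each node is [id, children]; leaves have an empty child list
        root = [next_id, []]
        tips = [root]
        while len(tips) < n_leaves:
            parent = tips.pop(rng.randrange(len(tips)))  # lineage that splits
            kids = []
            for _ in range(2):
                next_id += 1
                child = [next_id, []]
                kids.append(child)
                tips.append(child)
            parent[1] = kids
        return root, tips

    root, tips = yule_tree(8)
    ```

    Each split is equally likely to hit any extant lineage, which is exactly the uniform-splitting property that makes Yule trees analytically tractable compared with a structured coalescent run in a frequency-path background.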

  7. Treatment selection in a randomized clinical trial via covariate-specific treatment effect curves.

    PubMed

    Ma, Yunbei; Zhou, Xiao-Hua

    2017-02-01

    For time-to-event data in a randomized clinical trial, we proposed two new methods for selecting an optimal treatment for a patient based on the covariate-specific treatment effect curve, which is used to represent the clinical utility of a predictive biomarker. To select an optimal treatment for a patient with a specific biomarker value, we proposed pointwise confidence intervals for each covariate-specific treatment effect curve and for the difference between the covariate-specific treatment effect curves of two treatments. Furthermore, to select an optimal treatment for a future biomarker-defined subpopulation of patients, we proposed confidence bands for each covariate-specific treatment effect curve and for the difference between each pair of covariate-specific treatment effect curves over a fixed interval of biomarker values. We constructed the confidence bands based on a resampling technique. We also conducted simulation studies to evaluate the finite-sample properties of the proposed estimation methods. Finally, we illustrated the application of the proposed methods in a real-world data set.
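    The resampling construction can be illustrated with a percentile bootstrap for a simple difference-in-means effect; the paper's bands are built with the same resampling idea, applied pointwise along the biomarker axis. Data and function names below are invented:

    ```python
    import random

    def bootstrap_ci(effect, sample_a, sample_b,
                     n_boot=2000, alpha=0.05, seed=3):
        """Percentile-bootstrap confidence interval for a treatment-effect
        statistic -- the resampling idea behind the paper's confidence
        bands, shown for a plain difference in means."""
        rng = random.Random(seed)
        stats = []
        for _ in range(n_boot):
            ra = [rng.choice(sample_a) for _ in sample_a]  # resample arm A
            rb = [rng.choice(sample_b) for _ in sample_b]  # resample arm B
            stats.append(effect(ra, rb))
        stats.sort()
        lo = stats[int(n_boot * alpha / 2)]
        hi = stats[int(n_boot * (1 - alpha / 2)) - 1]
        return lo, hi

    def mean_diff(a, b):
        return sum(a) / len(a) - sum(b) / len(b)

    # toy outcomes under treatment vs control
    treated = [5.1, 6.0, 5.5, 6.2, 5.8, 6.4, 5.9, 6.1]
    control = [4.0, 4.5, 4.2, 4.8, 4.4, 4.6, 4.3, 4.7]
    lo, hi = bootstrap_ci(mean_diff, treated, control)
    ```

    A band over a biomarker interval is obtained by repeating this at each biomarker value and widening the quantiles to control simultaneous coverage.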

  8. Social cost impact assessment of pipeline infrastructure projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matthews, John C., E-mail: matthewsj@battelle.org; Allouche, Erez N., E-mail: allouche@latech.edu; Sterling, Raymond L., E-mail: sterling@latech.edu

    A key advantage of trenchless construction methods compared with traditional open-cut methods is their ability to install or rehabilitate underground utility systems with limited disruption to the surrounding built and natural environments. The equivalent monetary values of these disruptions are commonly called social costs. Social costs are often ignored by engineers or project managers during project planning and design phases, partially because they cannot be calculated using standard estimating methods. In recent years some approaches for estimating social costs were presented. Nevertheless, the cost data needed for validation of these estimating methods is lacking. Development of such social cost databases can be accomplished by compiling relevant information reported in various case histories. This paper identifies the eight most important social cost categories, presents mathematical methods for calculating them, and summarizes the social cost impacts for two pipeline construction projects. The case histories are analyzed in order to identify trends for the various social cost categories. The effectiveness of the methods used to estimate these values is also discussed. These findings are valuable for pipeline infrastructure engineers making renewal technology selection decisions by providing a more accurate process for the assessment of social costs and impacts. - Highlights: • Identified the eight most important social cost factors for pipeline construction • Presented mathematical methods for calculating those social cost factors • Summarized social cost impacts for two pipeline construction projects • Analyzed those projects to identify trends for the social cost factors.

  9. Construction of an evaluation and selection system of emergency treatment technology based on dynamic fuzzy GRA method for phenol spill

    NASA Astrophysics Data System (ADS)

    Zhao, Jingjing; Yu, Lean; Li, Lian

    2017-05-01

    There is often a great deal of complexity, fuzziness and uncertainty in chemical contingency spills. In order to obtain the optimum emergency disposal technology scheme as soon as a chemical pollution accident occurs, a technology evaluation system was developed based on the dynamic fuzzy GRA method, and the feasibility of the proposed method has been tested using an emergency phenol spill accident that occurred on a highway.
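    The grey relational analysis (GRA) core of such a system scores each candidate disposal scheme by its grey relational grade against an ideal reference sequence. A static, equal-weight sketch with invented criterion scores; the paper adds dynamic fuzzy weighting on top:

    ```python
    def grey_relational_grades(alternatives, reference, rho=0.5):
        """Grey relational analysis: score each alternative by its grey
        relational grade against an ideal reference sequence, using the
        standard distinguishing coefficient rho."""
        # absolute deviation sequences
        deltas = [[abs(a - r) for a, r in zip(alt, reference)]
                  for alt in alternatives]
        dmin = min(min(row) for row in deltas)
        dmax = max(max(row) for row in deltas)
        grades = []
        for row in deltas:
            # grey relational coefficient per criterion, then equal-weight mean
            coeffs = [(dmin + rho * dmax) / (d + rho * dmax) for d in row]
            grades.append(sum(coeffs) / len(coeffs))
        return grades

    # three disposal schemes scored on three normalized criteria (higher = better)
    reference = [1.0, 1.0, 1.0]                  # ideal scheme
    alternatives = [[0.9, 0.8, 1.0],             # scheme A
                    [0.6, 0.7, 0.5],             # scheme B
                    [1.0, 0.4, 0.6]]             # scheme C
    grades = grey_relational_grades(alternatives, reference)
    best = grades.index(max(grades))
    ```

    Scheme A tracks the ideal sequence most closely and receives the highest grade, so it would be recommended first by the evaluation system.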

  10. Apparatus and method for creating a photonic densely-accumulated ray-point

    NASA Technical Reports Server (NTRS)

    Park, Yeonjoon (Inventor); Choi, Sang H. (Inventor); King, Glen C. (Inventor); Elliott, James R. (Inventor)

    2012-01-01

    An optical apparatus includes an optical diffraction device configured for diffracting a predetermined wavelength of incident light onto adjacent optical focal points, and a photon detector for detecting a spectral characteristic of the predetermined wavelength. One of the optical focal points is a constructive interference point and the other optical focal point is a destructive interference point. The diffraction device, which may be a micro-zone plate (MZP) of micro-ring gratings or an optical lens, generates a constructive ray point using phase-contrasting of the destructive interference point. The ray point is located between adjacent optical focal points. A method of generating a densely-accumulated ray point includes directing incident light onto the optical diffraction device, diffracting the selected wavelength onto the constructive interference focal point and the destructive interference focal point, and generating the densely-accumulated ray point in a narrow region.

  11. Concepts and applications for influenza antigenic cartography

    PubMed Central

    Cai, Zhipeng; Zhang, Tong; Wan, Xiu-Feng

    2011-01-01

    Influenza antigenic cartography projects influenza antigens into a two- or three-dimensional map based on immunological datasets, such as hemagglutination inhibition and microneutralization assays. A robust antigenic cartography can facilitate influenza vaccine strain selection, since an intuitive antigenic map simplifies data interpretation. However, antigenic cartography construction is not trivial due to challenging features embedded in the immunological data, such as data incompleteness, high noise, and low reactors. To overcome these challenges, we developed a computational method, temporal Matrix Completion-Multidimensional Scaling (MC-MDS), by adapting the low-rank MC concept from the movie recommendation system in Netflix and the MDS method from geographic cartography construction. The application to H3N2 and 2009 pandemic H1N1 influenza A viruses demonstrates that temporal MC-MDS is effective and efficient in constructing influenza antigenic cartography. The web server is available at http://sysbio.cvm.msstate.edu/AntigenMap. PMID:21761589

  12. A robust method of thin plate spline and its application to DEM construction

    NASA Astrophysics Data System (ADS)

    Chen, Chuanfa; Li, Yanyan

    2012-11-01

    In order to avoid the ill-conditioning problem of the thin plate spline (TPS), the orthogonal least squares (OLS) method was introduced, and a modified OLS (MOLS) was developed. The MOLS version of TPS (TPS-M) can not only select significant points, termed knots, from large and dense sampling data sets, but also easily compute the weights of the knots by back-substitution. For interpolating large sets of sampling points, we developed a local TPS-M, where some neighboring sampling points around the point being estimated are selected for computation. Numerical tests indicate that, irrespective of sampling noise level, the average performance of TPS-M compares favorably with that of smoothing TPS. Under the same simulation accuracy, the computational time of TPS-M decreases as the number of sampling points increases. The smooth fitting results on lidar-derived noisy data indicate that TPS-M has an obvious smoothing effect, on par with smoothing TPS. The example of constructing a series of large-scale DEMs located in Shandong Province, China, was employed to comparatively analyze the estimation accuracies of the two versions of TPS and the classical interpolation methods, including inverse distance weighting (IDW), ordinary kriging (OK) and universal kriging with a second-order drift function (UK). Results show that, regardless of sampling interval and spatial resolution, TPS-M is more accurate than the classical interpolation methods, except for smoothing TPS at the finest sampling interval of 20 m, and the two versions of kriging at the spatial resolution of 15 m. In conclusion, TPS-M, which avoids the ill-conditioning problem, is considered a robust method for DEM construction.
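    For context, a plain thin plate spline interpolator can be written directly from its defining linear system; TPS-M differs by selecting knots with orthogonal least squares so the system stays well-conditioned on large samples. A small self-contained sketch with invented elevations:

    ```python
    import math

    def tps_fit(points, values):
        """Fit a 2-D thin plate spline f(x, y) = a0 + a1*x + a2*y +
        sum_i w_i * U(|p - p_i|), kernel U(r) = r^2 * log(r), by solving
        the standard (n+3) x (n+3) TPS system. Plain TPS for context;
        this is not the paper's knot-selecting TPS-M."""
        n = len(points)

        def U(r):
            return 0.0 if r == 0.0 else r * r * math.log(r)

        size = n + 3
        A = [[0.0] * size for _ in range(size)]
        b = [0.0] * size
        for i, (xi, yi) in enumerate(points):
            for j, (xj, yj) in enumerate(points):
                A[i][j] = U(math.hypot(xi - xj, yi - yj))
            A[i][n:n + 3] = [1.0, xi, yi]          # affine part
            A[n][i], A[n + 1][i], A[n + 2][i] = 1.0, xi, yi  # side conditions
            b[i] = values[i]

        # Gauss-Jordan elimination with partial pivoting
        for c in range(size):
            p = max(range(c, size), key=lambda r: abs(A[r][c]))
            A[c], A[p] = A[p], A[c]
            b[c], b[p] = b[p], b[c]
            for r in range(size):
                if r != c and A[r][c] != 0.0:
                    f = A[r][c] / A[c][c]
                    A[r] = [u - f * v for u, v in zip(A[r], A[c])]
                    b[r] -= f * b[c]
        coef = [b[i] / A[i][i] for i in range(size)]
        w, (a0, a1, a2) = coef[:n], coef[n:]

        def f(x, y):
            return a0 + a1 * x + a2 * y + sum(
                wi * U(math.hypot(x - xi, y - yi))
                for wi, (xi, yi) in zip(w, points))
        return f

    # tiny "DEM": elevations sampled at five (x, y) locations
    pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0), (0.5, 0.5)]
    z = [10.0, 12.0, 11.0, 13.0, 11.5]
    surface = tps_fit(pts, z)
    ```

    The fitted surface passes exactly through the sample elevations; as the number of points grows, this dense system becomes ill-conditioned, which is the problem the knot selection in TPS-M addresses.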

  13. Strategies to enhance waste minimization and energy conservation within organizations: a case study from the UK construction sector.

    PubMed

    Jones, Jo; Jackson, Janet; Tudor, Terry; Bates, Margaret

    2012-09-01

    Strategies for enhancing environmental management are a key focus for the government in the UK. Using a manufacturing company from the construction sector as a case study, this paper evaluates selected interventionist techniques, including environmental teams, awareness raising and staff training, to improve environmental performance. The study employed a range of methods, including questionnaire surveys and audits of energy consumption and waste generation, to examine the outcomes of the selected techniques. The results suggest that initially environmental management was not a focus for either the employees or the company. However, as a result of employing the techniques, the company was able to reduce energy consumption, increase recycling rates and achieve cost savings in excess of £132,000.

  14. Evaluation of accuracy of shade selection using two spectrophotometer systems: Vita Easyshade and Degudent Shadepilot

    PubMed Central

    Kalantari, Mohammad Hassan; Ghoraishian, Seyed Ahmad; Mohaghegh, Mina

    2017-01-01

    Objective: The aim of this in vitro study was to evaluate the accuracy of shade matching using two spectrophotometric devices. Materials and Methods: Thirteen patients who required a full coverage restoration for one of their maxillary central incisors, with the adjacent central incisor intact, were selected. Three identical frameworks were constructed for each tooth using computer-aided design and computer-aided manufacturing technology. Shade matching was performed using the Vita Easyshade spectrophotometer, the Shadepilot spectrophotometer, and the Vitapan classical shade guide for the first, second, and third crown, respectively. After application, firing, and glazing of the porcelain, the color was evaluated and scored by five inspectors. Results: Both spectrophotometric systems showed significantly better results than the visual method (P < 0.05), while there was no significant difference between the Vita Easyshade and Shadepilot spectrophotometers (P > 0.05). Conclusion: Spectrophotometers are a good substitute for visual color selection methods. PMID:28729792

  15. Absolute cosine-based SVM-RFE feature selection method for prostate histopathological grading.

    PubMed

    Sahran, Shahnorbanun; Albashish, Dheeb; Abdullah, Azizi; Shukor, Nordashima Abd; Hayati Md Pauzi, Suria

    2018-04-18

    Feature selection (FS) methods are widely used in grading and diagnosing prostate histopathological images. In this context, FS is based on the texture features obtained from the lumen, nuclei, cytoplasm and stroma, all of which are important tissue components. However, it is difficult to represent the high-dimensional textures of these tissue components. To solve this problem, we propose a new FS method that enables the selection of features with minimal redundancy in the tissue components. We categorise tissue images based on the texture of individual tissue components via the construction of a single classifier and also construct an ensemble learning model by merging the values obtained by each classifier. Another issue that arises is overfitting due to the high-dimensional texture of individual tissue components. We propose a new FS method, SVM-RFE(AC), that integrates a Support Vector Machine-Recursive Feature Elimination (SVM-RFE) embedded procedure with an absolute cosine (AC) filter method, which prevents redundancy among the features selected by SVM-RFE. We conducted experiments on H&E histopathological prostate and colon cancer images with respect to three prostate classifications, namely benign vs. grade 3, benign vs. grade 4 and grade 3 vs. grade 4. The colon benchmark dataset requires a distinction between grades 1 and 2, which are the most difficult cases to distinguish in the colon domain. The results obtained by both the single and ensemble classification models (the latter using the product rule as its merging method) confirm that the proposed SVM-RFE(AC) is superior to the other SVM- and SVM-RFE-based methods. We developed an FS method based on SVM-RFE and AC and successfully showed that its use enabled the identification of the most crucial texture feature of each tissue component. Thus, it makes possible the distinction between multiple Gleason grades (e.g. grade 3 vs. grade 4), and its performance is far superior to other reported FS methods. Copyright © 2018 Elsevier B.V. All rights reserved.
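The combination described above, a redundancy filter followed by SVM-RFE, can be sketched in a few lines. This is an illustrative reconstruction on synthetic data, not the paper's implementation: the threshold, dataset shape, and greedy filter order are assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE
from sklearn.svm import LinearSVC

# Synthetic stand-in for per-component texture features (shapes are
# illustrative, not the paper's histopathology data).
X, y = make_classification(n_samples=120, n_features=40, n_informative=8,
                           random_state=0)

# Absolute-cosine (AC) filter: greedily keep a feature only if its
# |cosine| similarity to every already-kept feature stays below a
# threshold, removing highly collinear (redundant) features.
def ac_filter(X, threshold=0.9):
    Xn = X / np.linalg.norm(X, axis=0)  # column-normalise
    kept = []
    for j in range(X.shape[1]):
        if all(abs(Xn[:, j] @ Xn[:, k]) < threshold for k in kept):
            kept.append(j)
    return kept

kept = ac_filter(X)

# SVM-RFE on the non-redundant subset: recursively eliminate the
# features with the smallest linear-SVM weights.
rfe = RFE(LinearSVC(max_iter=5000), n_features_to_select=8)
rfe.fit(X[:, kept], y)
selected = [kept[i] for i in np.flatnonzero(rfe.support_)]
print(len(selected))
```

The filter-then-wrapper order matters: pruning collinear features first keeps SVM-RFE from splitting weight across near-duplicates.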

  16. Rapid Generation of Marker-Free P. falciparum Fluorescent Reporter Lines Using Modified CRISPR/Cas9 Constructs and Selection Protocol.

    PubMed

    Mogollon, Catherin Marin; van Pul, Fiona J A; Imai, Takashi; Ramesar, Jai; Chevalley-Maurel, Séverine; de Roo, Guido M; Veld, Sabrina A J; Kroeze, Hans; Franke-Fayard, Blandine M D; Janse, Chris J; Khan, Shahid M

    2016-01-01

    The CRISPR/Cas9 system is a powerful genome editing technique employed in a wide variety of organisms including, recently, the human malaria parasite P. falciparum. Here we report on further improvements to the CRISPR/Cas9 transfection constructs and selection protocol to more rapidly modify the P. falciparum genome and to introduce transgenes into the parasite genome without the inclusion of drug-selectable marker genes. This method was used to stably integrate the gene encoding GFP into the P. falciparum genome under the control of promoters of three different Plasmodium genes (calmodulin, gapdh and hsp70). These genes were selected as they are highly transcribed in blood stages. We show that the three reporter parasite lines generated in this study (GFP@cam, GFP@gapdh and GFP@hsp70) have in vitro blood stage growth kinetics and drug-sensitivity profiles comparable to the parental P. falciparum (NF54) wild-type line. Both asexual and sexual blood stages of the three reporter lines expressed GFP-fluorescence, with GFP@hsp70 having the highest fluorescent intensity in schizont stages as shown by flow cytometry analysis of GFP-fluorescence intensity. The improved CRISPR/Cas9 constructs/protocol will aid in the rapid generation of transgenic and modified P. falciparum parasites, including those expressing different reporter proteins under different (stage-specific) promoters.

  17. Rapid Generation of Marker-Free P. falciparum Fluorescent Reporter Lines Using Modified CRISPR/Cas9 Constructs and Selection Protocol

    PubMed Central

    Mogollon, Catherin Marin; van Pul, Fiona J. A.; Imai, Takashi; Ramesar, Jai; Chevalley-Maurel, Séverine; de Roo, Guido M.; Veld, Sabrina A. J.; Kroeze, Hans; Franke-Fayard, Blandine M. D.; Janse, Chris J.

    2016-01-01

    The CRISPR/Cas9 system is a powerful genome editing technique employed in a wide variety of organisms including, recently, the human malaria parasite P. falciparum. Here we report on further improvements to the CRISPR/Cas9 transfection constructs and selection protocol to more rapidly modify the P. falciparum genome and to introduce transgenes into the parasite genome without the inclusion of drug-selectable marker genes. This method was used to stably integrate the gene encoding GFP into the P. falciparum genome under the control of promoters of three different Plasmodium genes (calmodulin, gapdh and hsp70). These genes were selected as they are highly transcribed in blood stages. We show that the three reporter parasite lines generated in this study (GFP@cam, GFP@gapdh and GFP@hsp70) have in vitro blood stage growth kinetics and drug-sensitivity profiles comparable to the parental P. falciparum (NF54) wild-type line. Both asexual and sexual blood stages of the three reporter lines expressed GFP-fluorescence, with GFP@hsp70 having the highest fluorescent intensity in schizont stages as shown by flow cytometry analysis of GFP-fluorescence intensity. The improved CRISPR/Cas9 constructs/protocol will aid in the rapid generation of transgenic and modified P. falciparum parasites, including those expressing different reporter proteins under different (stage-specific) promoters. PMID:27997583

  18. Selection of important ecological source patches based on Green Infrastructure theory: A case study of Wuhan city

    NASA Astrophysics Data System (ADS)

    Ke, Yuanyuan; Yu, Yan; Tong, Yan

    2018-01-01

    Selecting urban ecological patches is of great significance for constructing urban green infrastructure networks and protecting urban biodiversity and the ecological environment. With the support of GIS technology, a criterion for selecting source patches was developed according to existing planning. Ecological source patches for terrestrial organisms and for aquatic and amphibious organisms were then selected in Wuhan city. To increase the connectivity of the ecological patches and achieve greater ecological protection benefits, green infrastructure networks in Wuhan city were constructed with the minimum-path analysis method. Finally, the characteristics of the ecological source patches were analyzed with landscape metrics, and the ecological protection importance of each source patch was evaluated comprehensively. The results showed that there were 23 important ecological source patches in Wuhan city, among which the Sushan Temple forest patch and the Lu Lake and Shangshe Lake wetland patches were the most important for ecological protection. This study can provide a scientific basis for the preservation of urban ecological space, the delineation of natural conservation areas and the protection of biological diversity.

  19. A Decision-Making Method with Grey Multi-Source Heterogeneous Data and Its Application in Green Supplier Selection

    PubMed Central

    Dang, Yaoguo; Mao, Wenxin

    2018-01-01

    For multi-attribute decision-making problems in which the attribute values are grey multi-source heterogeneous data, a decision-making method based on kernel and greyness degree is proposed. The definitions of the kernel and greyness degree of an extended grey number in a grey multi-source heterogeneous data sequence are given. On this basis, we construct the kernel vector and greyness degree vector of the sequence to whiten the multi-source heterogeneous information, and a grey relational bi-directional projection ranking method is then presented. Considering the multi-attribute, multi-level decision structure and the causalities between attributes in the decision-making problem, the HG-DEMATEL method is proposed to determine the hierarchical attribute weights. A green supplier selection example is provided to demonstrate the rationality and validity of the proposed method. PMID:29510521

  20. A Decision-Making Method with Grey Multi-Source Heterogeneous Data and Its Application in Green Supplier Selection.

    PubMed

    Sun, Huifang; Dang, Yaoguo; Mao, Wenxin

    2018-03-03

    For multi-attribute decision-making problems in which the attribute values are grey multi-source heterogeneous data, a decision-making method based on kernel and greyness degree is proposed. The definitions of the kernel and greyness degree of an extended grey number in a grey multi-source heterogeneous data sequence are given. On this basis, we construct the kernel vector and greyness degree vector of the sequence to whiten the multi-source heterogeneous information, and a grey relational bi-directional projection ranking method is then presented. Considering the multi-attribute, multi-level decision structure and the causalities between attributes in the decision-making problem, the HG-DEMATEL method is proposed to determine the hierarchical attribute weights. A green supplier selection example is provided to demonstrate the rationality and validity of the proposed method.
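The kernel and greyness-degree whitening step can be illustrated for the simplest case, interval grey numbers. This is a minimal sketch using the common definitions (kernel = interval midpoint, greyness degree = interval width relative to the background domain); the sample intervals are illustrative, not from the paper.

```python
# Kernel of an interval grey number [lo, hi]: its midpoint.
def kernel(lo, hi):
    return (lo + hi) / 2.0

# Greyness degree: interval width relative to the background domain,
# 0 for a crisp (white) number, approaching 1 as uncertainty grows.
def greyness_degree(lo, hi, domain_lo, domain_hi):
    return (hi - lo) / (domain_hi - domain_lo)

# A sequence of grey attribute values on the domain [0, 10]:
seq = [(2.0, 4.0), (5.0, 5.0), (6.0, 9.0)]
kernels = [kernel(a, b) for a, b in seq]
greys = [greyness_degree(a, b, 0.0, 10.0) for a, b in seq]
print(kernels, greys)  # [3.0, 5.0, 7.5] [0.2, 0.0, 0.3]
```

The kernel vector carries the "whitened" magnitude information while the greyness vector carries the uncertainty, which is what the ranking method then projects.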

  1. A New Direction of Cancer Classification: Positive Effect of Low-Ranking MicroRNAs.

    PubMed

    Li, Feifei; Piao, Minghao; Piao, Yongjun; Li, Meijing; Ryu, Keun Ho

    2014-10-01

    Many studies based on microRNA (miRNA) expression profiles have shown a new aspect of cancer classification. Because one characteristic of miRNA expression data is its high dimensionality, feature selection methods have been used to facilitate dimensionality reduction. These feature selection methods have one shortcoming thus far: they only consider cases where the feature-to-class relationship is 1:1 or n:1. However, because one miRNA may influence more than one type of cancer, such miRNAs tend to be ranked low by traditional feature selection methods and are removed most of the time. Given the limited number of miRNAs, low-ranking miRNAs are also important to cancer classification. We considered both high- and low-ranking features to cover all cases (1:1, n:1, 1:n, and m:n) in cancer classification. First, we used the correlation-based feature selection method to select the high-ranking miRNAs, and chose the support vector machine, Bayes network, decision tree, k-nearest-neighbor, and logistic classifiers to construct cancer classifiers. Then, we chose the Chi-square test, information gain, gain ratio, and Pearson's correlation feature selection methods to build the m:n feature subset, and used the selected miRNAs to perform cancer classification. The low-ranking miRNA expression profiles achieved higher classification accuracy compared with using only the high-ranking miRNAs from traditional feature selection methods. Our results demonstrate that the m:n feature subset shows the positive effect of low-ranking miRNAs in cancer classification.
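The rank-then-classify pipeline the abstract describes can be sketched with a univariate ranking filter feeding a k-nearest-neighbor classifier. This is a hedged illustration on synthetic data (mutual information stands in for the information-gain ranking; the dataset and k are assumptions, not the study's):

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for miRNA expression profiles (illustrative only).
X, y = make_classification(n_samples=150, n_features=60, n_informative=10,
                           random_state=0)

# Information-gain-style univariate ranking (mutual information here);
# keeping a larger k retains features a strict 1:1 ranking would drop.
selector = SelectKBest(mutual_info_classif, k=15).fit(X, y)
X_sub = selector.transform(X)

# One of the classifiers named in the abstract (k-nearest-neighbor),
# evaluated by cross-validation on the selected subset.
acc = cross_val_score(KNeighborsClassifier(), X_sub, y, cv=5).mean()
print(X_sub.shape, round(acc, 2))
```

Varying `k` is the knob that decides whether low-ranking features survive into the subset.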

  2. Investigating the Advantages of Constructing Multidigit Numeration Understanding through Oneida and Lakota Native Languages.

    ERIC Educational Resources Information Center

    Hankes, Judith Elaine

    This paper documents a culturally specific language strength for developing number sense among Oneida- and Lakota-speaking primary students. Qualitative research methods scaffolded this research study: culture informants were interviewed and interviews were transcribed and coded for analysis; culture documents were selected for analysis; and…

  3. Using Scenarios and Simulations to Plan Colleges

    ERIC Educational Resources Information Center

    McIntyre, Chuck

    2004-01-01

    Using a case study, this article describes a method by which higher education institutions construct and use multiple future scenarios and simulations to plan strategically: to create visions of their futures, chart broad directions (mission and goals), and select learning and delivery strategies so as to achieve those broad directions. The…

  4. Discovering new knowledge about trees and forests. Selected papers from a meeting of IUFRO subject group 6.09: Philosophy and methods of forest research; 1985 August 19-23; Houghton, MI.

    Treesearch

    Rolfe E. Leary

    1989-01-01

    Presents fifteen papers and four abstracts in five topic areas: the research process, forestry constructs and innovations, interdisciplinarity, emerging research areas, and assessing research productivity, quality, and motivating scientists.

  5. Unsupervised MDP Value Selection for Automating ITS Capabilities

    ERIC Educational Resources Information Center

    Stamper, John; Barnes, Tiffany

    2009-01-01

    We seek to simplify the creation of intelligent tutors by using student data acquired from standard computer aided instruction (CAI) in conjunction with educational data mining methods to automatically generate adaptive hints. In our previous work, we have automatically generated hints for logic tutoring by constructing a Markov Decision Process…

  6. CONSTRUCTION OF EDUCATIONAL THEORY MODELS.

    ERIC Educational Resources Information Center

    MACCIA, ELIZABETH S.; AND OTHERS

    This study delineated models which have potential use in generating educational theory. A theory models method was formulated. By selecting and ordering concepts from other disciplines, the investigators formulated seven theory models. The final step of devising educational theory from the theory models was performed only to the extent required to…

  7. Classification and quantitation of milk powder by near-infrared spectroscopy and mutual information-based variable selection and partial least squares

    NASA Astrophysics Data System (ADS)

    Chen, Hui; Tan, Chao; Lin, Zan; Wu, Tong

    2018-01-01

    Milk is among the most popular nutrient sources worldwide and is of great interest due to its beneficial medicinal properties. The feasibility of classifying milk powder samples with respect to their brands and of determining protein concentration is investigated by NIR spectroscopy along with chemometrics. Two datasets were prepared for the experiment: one contains 179 samples of four brands for classification, and the other contains 30 samples for quantitative analysis. Principal component analysis (PCA) was used for exploratory analysis. Based on an effective model-independent variable selection method, i.e., minimal-redundancy maximal-relevance (MRMR), only 18 variables were selected to construct a partial least-squares discriminant analysis (PLS-DA) model. On the test set, the PLS-DA model based on the selected variable set was compared with the full-spectrum PLS-DA model, both of which achieved 100% accuracy. In quantitative analysis, the partial least-squares regression (PLSR) model constructed from the selected subset of 260 variables significantly outperforms the full-spectrum model. The combination of NIR spectroscopy, MRMR and PLS-DA or PLSR thus appears to be a powerful tool for classifying different brands of milk and determining protein content.

  8. Improvement of Predictive Ability by Uniform Coverage of the Target Genetic Space

    PubMed Central

    Bustos-Korts, Daniela; Malosetti, Marcos; Chapman, Scott; Biddulph, Ben; van Eeuwijk, Fred

    2016-01-01

    Genome-enabled prediction provides breeders with the means to increase the number of genotypes that can be evaluated for selection. One of the major challenges in genome-enabled prediction is how to construct a training set of genotypes from a calibration set that represents the target population of genotypes, where the calibration set is composed of a training and validation set. A random sampling protocol of genotypes from the calibration set will lead to low-quality coverage of the total genetic space by the training set when the calibration set contains population structure. As a consequence, predictive ability will be affected negatively, because some parts of the genotypic diversity in the target population will be under-represented in the training set, whereas other parts will be over-represented. Therefore, we propose a training set construction method that uniformly samples the genetic space spanned by the target population of genotypes, thereby increasing predictive ability. To evaluate our method, we constructed training sets alongside the identification of corresponding genomic prediction models for four genotype panels that differed in the amount of population structure they contained (maize Flint, maize Dent, wheat, and rice). Training sets were constructed using uniform sampling, stratified-uniform sampling, stratified sampling and random sampling. We compared these methods with a method that maximizes the generalized coefficient of determination (CD). Several training set sizes were considered. We investigated four genomic prediction models: multi-locus QTL models, GBLUP models, combinations of QTL and GBLUPs, and Reproducing Kernel Hilbert Space (RKHS) models. For the maize and wheat panels, construction of the training set under uniform sampling led to higher predictive ability than under stratified and random sampling. The results of our methods were similar to those of the CD method. 
For the rice panel, all training set construction methods led to similar predictive ability, a reflection of the very strong population structure in this panel. PMID:27672112
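One common way to sample a genetic space uniformly is to project genotypes into a low-dimensional space and pick representatives spread across it. The sketch below uses PCA plus k-means centroids as that heuristic; this is an illustrative stand-in on random marker data, not the authors' exact sampling algorithm.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Synthetic 0/1/2 marker matrix for 200 genotypes (illustrative only).
markers = rng.integers(0, 3, size=(200, 500)).astype(float)

# Project genotypes into a low-dimensional genetic space.
scores = PCA(n_components=5).fit_transform(markers)

# Cluster the space and take the genotype closest to each cluster
# centre, so every region of the genetic space contributes one
# training-set member (the "uniform coverage" idea).
def uniform_training_set(scores, n_train):
    km = KMeans(n_clusters=n_train, n_init=10, random_state=0).fit(scores)
    picks = []
    for c in km.cluster_centers_:
        picks.append(int(np.argmin(np.linalg.norm(scores - c, axis=1))))
    return sorted(set(picks))

train_idx = uniform_training_set(scores, n_train=40)
print(len(train_idx))
```

Under population structure, random sampling would over-draw from the large subpopulations; the cluster-based pick forces coverage of the small ones too.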

  9. Molecularly imprinted covalent organic polymers for the selective extraction of benzoxazole fluorescent whitening agents from food samples.

    PubMed

    Ding, Hui; Wang, Rongyu; Wang, Xiao; Ji, Wenhua

    2018-06-21

    Molecularly imprinted covalent organic polymers were constructed by an imine-linking reaction between 1,3,5-triformylphloroglucinol and 2,6-diaminopyridine and used for the selective solid-phase extraction of benzoxazole fluorescent whitening agents from food samples. Binding experiments showed that the imprinting sites on the molecularly imprinted polymers had higher selectivity for the targets than the corresponding non-imprinted polymers. Parameters affecting the solid-phase extraction procedure were examined. Under optimal conditions, real samples were treated and the eluent was analyzed by high-performance liquid chromatography with diode-array detection. The results showed that the established method offered wide linearity, satisfactory detection and quantification limits, and acceptable recoveries. Thus, this method has practical potential for the selective determination of benzoxazole fluorescent whitening agents in complex food samples. This article is protected by copyright. All rights reserved.

  10. Windows for New Construction | Efficient Windows Collaborative

    Science.gov Websites

    Guidance on windows for new construction, covering the window selection tool, the selection process, design guidance, installation, benefits, design considerations, measuring performance, and performance standards.

  11. NASA Tech House: An early evaluation

    NASA Technical Reports Server (NTRS)

    1977-01-01

    An architect-engineering firm, as well as university participants, performed system studies, evaluated construction methods, performed cost effectiveness studies, and prepared construction drawings which incorporated the selected technology features into a final design. A Technology Utilization House (Tech House) based on this design was constructed at the NASA Langley Research Center in Hampton, Virginia. The Tech House is instrumented so that the performance of the design features and energy systems can be evaluated during a planned family live-in period. As such, the house is both a demonstration unit and a research laboratory. The Tech House is intended to demonstrate the kind of single-family residence likely to be available within the next five years.

  12. Construction of a virtual combinatorial library using SMILES strings to discover potential structure-diverse PPAR modulators.

    PubMed

    Liao, Chenzhong; Liu, Bing; Shi, Leming; Zhou, Jiaju; Lu, Xian-Ping

    2005-07-01

    Based on the structural characteristics of PPAR modulators, a virtual combinatorial library containing 1,226,625 compounds was constructed using SMILES strings. Selected ADME filters were employed to remove compounds having poor drug-like properties from this library. The library was converted to sdf and mol2 files by CONCORD 4.0, and was then docked to PPARgamma by DOCK 4.0 to identify new chemical entities that may be potential drug leads against type 2 diabetes and other metabolic diseases. The method of constructing a virtual combinatorial library using SMILES strings was further implemented as a Visual Basic .NET tool that can facilitate the generation of other types of virtual combinatorial libraries.
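The core trick of SMILES-based library construction is that fragments written as SMILES substrings can be enumerated by string concatenation. The toy sketch below shows the combinatorial idea only; the fragment lists are hypothetical, and a real library would use curated, chemically valid attachment points (and a toolkit to validate the products).

```python
from itertools import product

# Hypothetical fragment lists written as SMILES substrings.
heads = ["c1ccccc1", "c1ccncc1"]    # aryl cores
linkers = ["CC(=O)N", "CCO"]        # linker fragments
tails = ["C(=O)O", "CN"]            # acidic / basic tails

# Enumerate the virtual library as concatenated SMILES strings:
# every head x linker x tail combination yields one candidate.
library = [h + l + t for h, l, t in product(heads, linkers, tails)]
print(len(library))  # 2 * 2 * 2 = 8
```

With three fragment lists of sizes a, b and c the library size is simply a*b*c, which is how lists of modest size reach the million-compound scale reported in the abstract.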

  13. Habitat suitability criteria via parametric distributions: estimation, model selection and uncertainty

    USGS Publications Warehouse

    Som, Nicholas A.; Goodman, Damon H.; Perry, Russell W.; Hardy, Thomas B.

    2016-01-01

    Previous methods for constructing univariate habitat suitability criteria (HSC) curves have ranged from professional judgement to kernel-smoothed density functions or combinations thereof. We present a new method of generating HSC curves that applies probability density functions as the mathematical representation of the curves. Compared with previous approaches, benefits of our method include (1) estimation of probability density function parameters directly from raw data, (2) quantitative methods for selecting among several candidate probability density functions, and (3) concise methods for expressing estimation uncertainty in the HSC curves. We demonstrate our method with a thorough example using data collected on the depth of water used by juvenile Chinook salmon (Oncorhynchus tschawytscha) in the Klamath River of northern California and southern Oregon. All R code needed to implement our example is provided in the appendix. Published 2015. This article is a U.S. Government work and is in the public domain in the USA.
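The fit-and-select workflow the abstract describes (maximum-likelihood fits of candidate probability density functions, then a quantitative criterion to choose among them) can be sketched with AIC as the criterion. This is an illustrative sketch on synthetic depth data, not the Klamath dataset, and the authors' R code may use different candidates and criteria.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Synthetic depth-use observations (illustrative, not the field data).
depth = rng.gamma(shape=3.0, scale=0.4, size=300)

# Fit several candidate probability density functions by maximum
# likelihood and compare them with AIC = 2k - 2*log-likelihood.
candidates = {"gamma": stats.gamma, "lognorm": stats.lognorm,
              "weibull": stats.weibull_min}
aic = {}
for name, dist in candidates.items():
    params = dist.fit(depth, floc=0)  # fix the location at zero
    ll = np.sum(dist.logpdf(depth, *params))
    k = len(params) - 1               # loc was fixed, not estimated
    aic[name] = 2 * k - 2 * ll

best = min(aic, key=aic.get)
print(best)
```

The fitted parameters of the winning distribution then define the HSC curve directly, and their estimation uncertainty can be expressed with standard likelihood-based intervals.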

  14. Encoded Library Synthesis Using Chemical Ligation and the Discovery of sEH Inhibitors from a 334-Million Member Library

    PubMed Central

    Litovchick, Alexander; Dumelin, Christoph E.; Habeshian, Sevan; Gikunju, Diana; Guié, Marie-Aude; Centrella, Paolo; Zhang, Ying; Sigel, Eric A.; Cuozzo, John W.; Keefe, Anthony D.; Clark, Matthew A.

    2015-01-01

    A chemical ligation method for construction of DNA-encoded small-molecule libraries has been developed. Taking advantage of the ability of the Klenow fragment of DNA polymerase to accept templates with triazole linkages in place of phosphodiesters, we have designed a strategy for chemically ligating oligonucleotide tags using cycloaddition chemistry. We have utilized this strategy in the construction and selection of a small molecule library, and successfully identified inhibitors of the enzyme soluble epoxide hydrolase. PMID:26061191

  15. Paleodemographic age-at-death distributions of two Mexican skeletal collections: a comparison of transition analysis and traditional aging methods.

    PubMed

    Bullock, Meggan; Márquez, Lourdes; Hernández, Patricia; Ruíz, Fernando

    2013-09-01

    Traditional methods of aging adult skeletons suffer from the problem of age mimicry of the reference collection, as described by Bocquet-Appel and Masset (1982). Transition analysis (Boldsen et al., 2002) is a method of aging adult skeletons that addresses the problem of age mimicry of the reference collection by allowing users to select an appropriate prior probability. In order to evaluate whether transition analysis results in significantly different age estimates for adults, the method was applied to skeletal collections from Postclassic Cholula and Contact-Period Xochimilco. The resulting age-at-death distributions were then compared with age-at-death distributions for the two populations constructed using traditional aging methods. Although the traditional aging methods result in age-at-death distributions with high young adult mortality and few individuals living past the age of 50, the age-at-death distributions constructed using transition analysis indicate that most individuals who lived into adulthood lived past the age of 50. Copyright © 2013 Wiley Periodicals, Inc.
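The role of the user-chosen prior in transition analysis can be illustrated with a toy Bayesian update: the posterior age distribution is proportional to the stage-given-age likelihood times the prior. Everything below is hypothetical (the logistic likelihood and both priors are stand-ins, not the Boldsen et al. model).

```python
import numpy as np

ages = np.arange(15, 91)

# Hypothetical probability of observing a given skeletal "transition
# stage" at each age (a logistic curve standing in for the fitted
# transition-analysis likelihood).
def p_stage_given_age(age, midpoint=50.0, spread=8.0):
    return 1.0 / (1.0 + np.exp(-(age - midpoint) / spread))

# Two illustrative priors: flat, and one weighting older adults more.
# The analyst's freedom to choose this prior is what lets transition
# analysis avoid mimicking the reference collection's age structure.
flat_prior = np.ones_like(ages, dtype=float)
old_heavy_prior = np.linspace(0.5, 2.0, ages.size)

def posterior_mean(prior):
    post = p_stage_given_age(ages) * prior
    post /= post.sum()
    return float(np.sum(ages * post))

m_flat = posterior_mean(flat_prior)
m_old = posterior_mean(old_heavy_prior)
print(round(m_flat, 1), round(m_old, 1))
# The old-heavy prior shifts the posterior mean age upward.
```

This is exactly the mechanism behind the abstract's finding: with an appropriate prior, the same skeletal indicators support many individuals living past 50.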

  16. Selectivity Mechanism of ATP-Competitive Inhibitors for PKB and PKA.

    PubMed

    Wu, Ke; Pang, Jingzhi; Song, Dong; Zhu, Ying; Wu, Congwen; Shao, Tianqu; Chen, Haifeng

    2015-07-01

    Protein kinase B (PKB) acts as a central node in the PI3K kinase pathway. Constitutive activation and overexpression of PKB have been implicated in various cancers. However, protein kinase A (PKA), which shares high homology with PKB, is essential for metabolic regulation. Therefore, specifically targeting PKB is a crucial strategy in drug design and development for antitumor therapy. Here, we have revealed the selectivity mechanism of PKB inhibitors with molecular dynamics simulation and 3D-QSAR methods. Selective inhibitors of PKB can form more hydrogen bonds and hydrophobic contacts with PKB than with PKA. This explains why the selective inhibitor M128 is more potent against PKB than against PKA. Then, 3D-QSAR models were constructed for these selective inhibitors and evaluated with test-set compounds. Comparison of the 3D-QSAR models for PKB inhibitors and PKA inhibitors reveals possible ways to improve inhibitor selectivity. These models can be used to design new chemical entities and to make quantitative predictions about specific selective inhibitors before resorting to in vitro and in vivo experiments. © 2014 John Wiley & Sons A/S.

  17. Wire bonding quality monitoring via refining process of electrical signal from ultrasonic generator

    NASA Astrophysics Data System (ADS)

    Feng, Wuwei; Meng, Qingfeng; Xie, Youbo; Fan, Hong

    2011-04-01

    In this paper, a technique for on-line quality detection of ultrasonic wire bonding is developed. The electrical signals from the ultrasonic generator supply, namely, voltage and current, are picked up by a measuring circuit and transformed into digital signals by a data acquisition system. A new feature extraction method is presented to characterize the transient properties of the electrical signals and further evaluate the bond quality. The method includes three steps. First, the captured voltage and current are filtered by digital bandpass filter banks to obtain the corresponding subband signals, such as the fundamental signal, second harmonic, and third harmonic. Second, each subband envelope is obtained using the Hilbert transform for further feature extraction. Third, the subband envelopes are separated into three phases, namely, envelope rising, stable, and damping phases, to capture the subtle waveform changes. Different waveform features are extracted from each phase of these subband envelopes. The principal component analysis (PCA) method is used for feature selection in order to remove redundant information and reduce the dimension of the original feature variables. Using the selected features as inputs, an artificial neural network (ANN) is constructed to identify the complex bond fault pattern. By analyzing experimental data with the proposed feature extraction method and neural network, the results demonstrate the advantages of the proposed feature extraction method and the constructed artificial neural network in detecting and identifying bond quality.
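The final two stages of the pipeline, PCA-based dimensionality reduction feeding an ANN fault classifier, can be sketched generically. This is a hedged illustration on synthetic feature vectors, not the paper's envelope features or network architecture.

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for the per-phase envelope waveform features,
# with three classes standing in for bond-fault patterns.
X, y = make_classification(n_samples=200, n_features=30, n_informative=6,
                           n_classes=3, n_clusters_per_class=1,
                           random_state=0)

# PCA removes redundant feature dimensions; the reduced scores feed a
# small neural network that labels the bond-fault pattern.
clf = make_pipeline(StandardScaler(),
                    PCA(n_components=8),
                    MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                                  random_state=0))
acc = cross_val_score(clf, X, y, cv=5).mean()
print(round(acc, 2))
```

Standardising before PCA matters here because the envelope features from different phases would otherwise dominate the components purely by scale.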

  18. Construction of optical glucose nanobiosensor with high sensitivity and selectivity at physiological pH on the basis of organic-inorganic hybrid microgels.

    PubMed

    Wu, Weitai; Zhou, Ting; Aiello, Michael; Zhou, Shuiqin

    2010-08-15

    A new class of optical glucose nanobiosensors with high sensitivity and selectivity at physiological pH is described. To construct these glucose nanobiosensors, the fluorescent CdS quantum dots (QDs), serving as the optical code, were incorporated into the glucose-sensitive poly(N-isopropylacrylamide-acrylamide-2-acrylamidomethyl-5-fluorophenylboronic acid) copolymer microgels, via both in situ growth method and "breathing in" method, respectively. The polymeric gel can adapt to surrounding glucose concentrations, and regulate the fluorescence of the embedded QDs, converting biochemical signals into optical signals. The gradual swelling of the gel would lead to the quenching of the fluorescence at the elevated glucose concentrations. The hybrid microgels displayed high selectivity to glucose over the potential primary interferents of lactate and human serum albumin in the physiologically important glucose concentration range. The stability, reversibility, and sensitivity of the organic-inorganic hybrid microgel-based biosensors were also systematically studied. These general properties of our nanobiosensors are well tunable under appropriate tailor on the hybrid microgels, in particular, simply through the change in the crosslinking degree of the microgels. The optical glucose nanobiosensors based on the organic-inorganic hybrid microgels have shown the potential for a third generation fluorescent biosensor. Copyright 2010 Elsevier B.V. All rights reserved.

  19. Study on the Selection of Equipment Suppliers for Wind Power Generation EPC Project

    NASA Astrophysics Data System (ADS)

    Yang, Yuanyue; Li, Huimin

    2017-12-01

    In an EPC project, the purchase cost of equipment accounts for about 60% of the total project cost; thus, the selection of equipment suppliers has an important influence on the project. Taking the EPC project for Phase I of the Guizhou Huaxi Yunding wind power plant as its research background, this paper constructs an evaluation index system for the selection of equipment suppliers for wind power generation EPC projects from multiple perspectives, and introduces a matter-element extension evaluation model to evaluate supplier selection for this project from both qualitative and quantitative points of view. The result is consistent with the actual situation, which verifies the validity and operability of the method.
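The heart of a matter-element extension evaluation is the correlation function that scores an index value against a grade's classical domain inside the joint domain. A minimal sketch of the standard form follows; the intervals and the sample value are illustrative, not the project's data.

```python
# Extension distance of value v from the interval [lo, hi]:
# negative inside the interval, positive outside it.
def rho(v, lo, hi):
    return abs(v - (lo + hi) / 2.0) - (hi - lo) / 2.0

# Correlation of v with a grade whose classical domain is [a, b],
# nested in the joint domain [c, d]; K > 0 means v belongs to the grade.
def correlation(v, a, b, c, d):
    r0, rp = rho(v, a, b), rho(v, c, d)
    if a <= v <= b:  # inside the classical domain
        return -r0 / (b - a)
    return r0 / (rp - r0)

# Score v = 7 against grade "good" = [6, 8] within joint domain [0, 10]:
k = correlation(7.0, 6.0, 8.0, 0.0, 10.0)
print(round(k, 2))  # 0.5 -> positively belongs to the grade
```

In a full supplier evaluation, such correlations are computed per index and per grade, combined with the attribute weights, and the grade with the largest weighted correlation wins.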

  20. The dimension split element-free Galerkin method for three-dimensional potential problems

    NASA Astrophysics Data System (ADS)

    Meng, Z. J.; Cheng, H.; Ma, L. D.; Cheng, Y. M.

    2018-06-01

    This paper presents the dimension split element-free Galerkin (DSEFG) method for three-dimensional potential problems, and the corresponding formulae are obtained. The main idea of the DSEFG method is that a three-dimensional potential problem can be transformed into a series of two-dimensional problems. For these two-dimensional problems, the improved moving least-squares (IMLS) approximation is applied to construct the shape function, which uses an orthogonal function system with a weight function as the basis functions. The Galerkin weak form is applied to obtain a discretized system equation, and the penalty method is employed to impose the essential boundary condition. The finite difference method is selected in the splitting direction. For the purposes of demonstration, some selected numerical examples are solved using the DSEFG method. The convergence study and error analysis of the DSEFG method are presented. The numerical examples show that the DSEFG method has greater computational precision and computational efficiency than the IEFG method.

  1. Why developmental niche construction is not selective niche construction: and why it matters

    PubMed Central

    2017-01-01

    In the last decade, niche construction has been heralded as the neglected process in evolution. But niche construction is just one way in which the organism's interaction with and construction of the environment can have potential evolutionary significance. The constructed environment does not just select for, it also produces new variation. Nearly 3 decades ago, and in parallel with Odling-Smee's article ‘Niche-constructing phenotypes’, West and King introduced the ‘ontogenetic niche’ to give the phenomena of exogenetic inheritance a formal name. Since then, a range of fields in the life sciences and medicine have amassed evidence that parents influence their offspring by means other than DNA (parental effects), and proposed mechanisms for how heritable variation can be environmentally induced and developmentally regulated. The concept of ‘developmental niche construction’ (DNC) elucidates how a diverse range of mechanisms contributes to the transgenerational transfer of developmental resources. My central claim is that whereas the selective niche of niche construction theory is primarily used to explain the active role of the organism in its selective environment, DNC is meant to indicate the active role of the organism in its developmental environment. The paper highlights the differences between the construction of the selective and the developmental niche, and explores the overall significance of DNC for evolutionary theory. PMID:28839923

  2. Selection is more intelligent than design: improving the affinity of a bivalent ligand through directed evolution.

    PubMed

    Ahmad, Kareem M; Xiao, Yi; Soh, H Tom

    2012-12-01

    Multivalent molecular interactions can be exploited to dramatically enhance the performance of an affinity reagent. The enhancement in affinity and specificity achieved with a multivalent construct depends critically on the effectiveness of the scaffold that joins the ligands, as this determines their positions and orientations with respect to the target molecule. Currently, no generalizable design rules exist for construction of an optimal multivalent ligand for targets with known structures, and the design challenge remains an insurmountable obstacle for the large number of proteins whose structures are not known. As an alternative to such design-based strategies, we report here a directed evolution-based method for generating optimal bivalent aptamers. To demonstrate this approach, we fused two thrombin aptamers with a randomized DNA sequence and used a microfluidic in vitro selection strategy to isolate scaffolds with exceptionally high affinities. Within five rounds of selection, we generated a bivalent aptamer that binds thrombin with an apparent dissociation constant (K(d)) <10 pM, representing a ∼200-fold improvement in binding affinity over the monomeric aptamers and a ∼15-fold improvement over the best designed bivalent construct. The process described here can be used to produce high-affinity multivalent aptamers and could potentially be adapted to other classes of biomolecules.

  3. Mojo Hand, a TALEN design tool for genome editing applications.

    PubMed

    Neff, Kevin L; Argue, David P; Ma, Alvin C; Lee, Han B; Clark, Karl J; Ekker, Stephen C

    2013-01-16

    Recent studies of transcription activator-like (TAL) effector domains fused to nucleases (TALENs) demonstrate enormous potential for genome editing. Effective design of TALENs requires a combination of selecting appropriate genetic features, finding pairs of binding sites based on a consensus sequence, and, in some cases, identifying endogenous restriction sites for downstream molecular genetic applications. We present the web-based program Mojo Hand for designing TAL and TALEN constructs for genome editing applications (http://www.talendesign.org). We describe the algorithm and its implementation. The features of Mojo Hand include (1) automatic download of genomic data from the National Center for Biotechnology Information, (2) analysis of any DNA sequence to reveal pairs of binding sites based on a user-defined template, (3) selection of restriction-enzyme recognition sites in the spacer between the TAL monomer binding sites, including options for the selection of restriction enzyme suppliers, and (4) output files designed for subsequent TALEN construction using the Golden Gate assembly method. Mojo Hand enables the rapid identification of TAL binding sites for use in TALEN design. The assembly of TALEN constructs is also simplified by using the TAL-site prediction program in conjunction with a spreadsheet aid for managing reagent concentrations and TALEN formulation. Mojo Hand enables scientists to more rapidly deploy TALENs for genome editing applications.

  4. Runaway cultural niche construction

    PubMed Central

    Rendell, Luke; Fogarty, Laurel; Laland, Kevin N.

    2011-01-01

    Cultural niche construction is a uniquely potent source of selection on human populations, and a major cause of recent human evolution. Previous theoretical analyses have not, however, explored the local effects of cultural niche construction. Here, we use spatially explicit coevolutionary models to investigate how cultural processes could drive selection on human genes by modifying local resources. We show that cultural learning, expressed in local niche construction, can trigger a process with dynamics that resemble runaway sexual selection. Under a broad range of conditions, cultural niche-constructing practices generate selection for gene-based traits and hitchhike to fixation through the build-up of statistical associations between practice and trait. This process can occur even when the cultural practice is costly, or is subject to counteracting transmission biases, or the genetic trait is selected against. Under some conditions a secondary hitchhiking occurs, through which genetic variants that enhance the capability for cultural learning are also favoured by similar dynamics. We suggest that runaway cultural niche construction could have played an important role in human evolution, helping to explain why humans are simultaneously the species with the largest relative brain size, the most potent capacity for niche construction and the greatest reliance on culture. PMID:21320897

  5. Cross-cultural comparison of concrete recycling decision-making and implementation in construction industry.

    PubMed

    Tam, Vivian W Y; Tam, Leona; Le, Khoa N

    2010-02-01

    Waste management in the construction industry is under increasing pressure, with alarming signals. Concrete waste constitutes a major proportion, 81%, of construction and demolition waste in Australia. To minimize concrete waste generated from construction activities, recycling it is one of the best methods to conserve the environment. This paper investigates concrete recycling implementation in construction. Japan is a leading country in recycling concrete waste, achieving a 98% recycling rate and using recycled concrete for structural applications. Hong Kong is developing concrete recycling programs for high-grade applications. Australia is making relatively slow progress in implementing concrete recycling in construction. Therefore, empirical studies in Australia, Hong Kong, and Japan were selected for this paper. A questionnaire survey and structured interviews were conducted, and power spectrum analysis was used. It was found that "increasing overall business competitiveness and strategic business opportunities" was considered the major benefit of concrete recycling by Hong Kong and Japanese respondents, while "rising concrete recycling awareness such as selecting suitable resources, techniques and training and compliance with regulations" was considered the major benefit by Australian respondents. However, "lack of clients' support", "increase in management cost" and "increase in documentation workload, such as working documents, procedures and tools" were the major difficulties encountered by Australian, Hong Kong, and Japanese respondents, respectively. To improve the existing implementation, "inclusion of concrete recycling evaluation in tender appraisal" and "defining clear legal evaluation of concrete recycling" were the major recommendations from Australian and Hong Kong respondents, and from Japanese respondents, respectively.

  6. Materials and techniques for model construction

    NASA Technical Reports Server (NTRS)

    Wigley, D. A.

    1985-01-01

    The problems confronting the designer of models for cryogenic wind tunnels are discussed with particular reference to the difficulties in obtaining appropriate data on the mechanical and physical properties of candidate materials and their fabrication technologies. The relationship between strength and toughness of alloys is discussed in the context of maximizing both and avoiding the problem of dimensional and microstructural instability. All major classes of materials used in model construction are considered in some detail, and in the Appendix selected numerical data are given for the most relevant materials. The stepped-specimen program to investigate stress-induced dimensional changes in alloys is discussed in detail together with interpretation of the initial results. The methods used to bond model components are considered with particular reference to the selection of filler alloys and temperature cycles to avoid microstructural degradation and loss of mechanical properties.

  7. Tag SNP selection via a genetic algorithm.

    PubMed

    Mahdevar, Ghasem; Zahiri, Javad; Sadeghi, Mehdi; Nowzari-Dalini, Abbas; Ahrabian, Hayedeh

    2010-10-01

    Single Nucleotide Polymorphisms (SNPs) provide valuable information on human evolutionary history and may lead us to identify genetic variants responsible for human complex diseases. Unfortunately, molecular haplotyping methods are costly, laborious, and time consuming; therefore, algorithms that computationally reconstruct full haplotype patterns from a small subset of the available data, the tag SNP selection problem, are convenient and attractive. This problem is provably NP-hard, so heuristic methods may be useful. In this paper we present a heuristic method based on a genetic algorithm to find reasonable solutions within acceptable time. The algorithm was tested on a variety of simulated and experimental data. In comparison with the exact algorithm, based on a brute-force approach, results show that our method can obtain optimal solutions in almost all cases and runs much faster than the exact algorithm when the number of SNP sites is large. Our software is available upon request to the corresponding author.
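
The genetic-algorithm idea can be sketched as follows: binary chromosomes mark which SNP columns serve as tags, and fitness rewards small tag sets that still tell every haplotype in the sample apart. The toy haplotypes, fitness weights, and GA parameters are illustrative assumptions, not the authors' encoding.

```python
import random

random.seed(0)

HAPLOTYPES = [
    "00110", "01010", "11000", "10101", "00011", "11110",
]
N_SNPS = len(HAPLOTYPES[0])

def distinguishes(mask):
    """True if the selected SNPs separate every pair of haplotypes."""
    seen = set()
    for hap in HAPLOTYPES:
        key = tuple(c for c, m in zip(hap, mask) if m)
        if key in seen:
            return False
        seen.add(key)
    return True

def fitness(mask):
    # Fewer tags is better; failing to separate haplotypes is heavily penalized.
    penalty = 0 if distinguishes(mask) else 100
    return -(sum(mask) + penalty)

def evolve(pop_size=30, generations=60, mut_rate=0.1):
    pop = [[random.randint(0, 1) for _ in range(N_SNPS)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # elitist survivor selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_SNPS)   # one-point crossover
            child = a[:cut] + b[cut:]
            child = [1 - g if random.random() < mut_rate else g for g in child]
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

With this tiny instance an exhaustive search over all 2^5 masks would of course suffice; the GA only pays off when the number of SNP sites is large, as the abstract notes.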

  8. Military Housing Privatization Initiative (MHPI), Eglin AFB, Florida and Hurlburt Field, Florida. Final Environmental Impact Statement

    DTIC Science & Technology

    2011-05-01

    There are several different methods available for determining stormwater runoff peak flows. Two of the most widely used methods are the Rational...environmental factors between the alternatives differ in terms of their respective potential for adverse effects relative to their location. ENVIRONMENTAL...Force selects a development proposal. As a result, the actual project scope may result in different numbers of units constructed or demolished, or

  9. Comprehensive evaluation of global energy interconnection development index

    NASA Astrophysics Data System (ADS)

    Liu, Lin; Zhang, Yi

    2018-04-01

    Against the background of building global energy interconnection and realizing green, low-carbon development, this article constructs a global energy interconnection development index system based on the current state of global energy interconnection development. Using the entropy method to determine the weights of the development indices, and then the gray correlation method to analyze the selected countries, the article obtains a ranking and level classification of the global energy interconnection development index.
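
The entropy weighting step can be sketched as follows: indicator values are normalized per column, each indicator's information entropy is computed, and indicators with lower entropy (more divergence across countries) receive larger weights. The data matrix below is a made-up example, not the paper's index values.

```python
import math

def entropy_weights(matrix):
    """Entropy-method weights for columns (indicators) of a data matrix."""
    n = len(matrix)       # alternatives (e.g., countries)
    m = len(matrix[0])    # indicators
    divergences = []
    for j in range(m):
        col = [row[j] for row in matrix]
        total = sum(col)
        p = [v / total for v in col]
        # Information entropy of indicator j, normalized by ln(n).
        e = -sum(pi * math.log(pi) for pi in p if pi > 0) / math.log(n)
        divergences.append(1.0 - e)   # degree of divergence
    d_sum = sum(divergences)
    return [d / d_sum for d in divergences]

# Invented indicator values for three countries, three indicators:
data = [
    [0.9, 120, 3.1],
    [0.7, 95, 2.8],
    [0.8, 110, 3.0],
]
w = entropy_weights(data)
```

A real application would first normalize positive and negative indicators onto a common scale before computing the entropy; that preprocessing is omitted here for brevity.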

  10. Palladium-catalyzed Br/D exchange of arenes: Selective deuterium incorporation with versatile functional group tolerance and high efficiency

    DOE PAGES

    Zhang, Honghai -Hai; Bonnesen, Peter V.; Hong, Kunlun

    2015-07-13

    A facile method for introducing one or more deuterium atoms onto an aromatic nucleus via Br/D exchange, with high functional group tolerance and high incorporation efficiency, is disclosed. Deuterium-labeled aryl chlorides and aryl borates, which could be used as substrates in cross-coupling reactions to construct more complicated deuterium-labeled compounds, can also be synthesized by this method.

  11. 48 CFR 836.602-4 - Selection authority.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 836.602-4 Selection authority. The Director, Office of Construction and Facilities Management (for Central Office contracts), the Director, Office of Construction Management (for National Cemetery Administration contracts...

  12. 48 CFR 836.602-4 - Selection authority.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 836.602-4 Selection authority. The Director, Office of Construction and Facilities Management (for Central Office contracts), the Director, Office of Construction Management (for National Cemetery Administration contracts...

  13. 48 CFR 836.602-4 - Selection authority.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 836.602-4 Selection authority. The Director, Office of Construction and Facilities Management (for Central Office contracts), the Director, Office of Construction Management (for National Cemetery Administration contracts...

  14. 48 CFR 836.602-4 - Selection authority.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 836.602-4 Selection authority. The Director, Office of Construction and Facilities Management (for Central Office contracts), the Director, Office of Construction Management (for National Cemetery Administration contracts...

  15. Fashion Students Choose How to Learn by Constructing Videos of Pattern Making

    ERIC Educational Resources Information Center

    Cavanagh, Michaella; Peté, Marí

    2017-01-01

    This paper analyses new learning experiences of first year pattern technology students at a university of technology, in the context of selected characteristics of authentic learning theories. The paper contributes to existing knowledge by proposing a method that could be followed for design-based subjects in a vocational education setting.…

  16. Constructing Core Journal Lists: Mixing Science and Alchemy.

    ERIC Educational Resources Information Center

    Corby, Katherine

    2003-01-01

    Via an overview of core journal studies, emphasizing the social sciences and education, this review looks for best practices in both motivation and methodology. Selection decisions receive particular focus. Lack of correlation between methods is indicative of the complexity of the topic and the need for judgment in design and use. (Author)

  17. The Quest for Quality. Sixteen Forms of Heresy in Higher Education.

    ERIC Educational Resources Information Center

    Goodlad, Sinclair

    This book is an exploration of the current debate about quality in higher education. Using a construct of "heresies," it suggests a set of guiding principles in four key areas of university life: curriculum (because selecting what is worth learning in universities is not random); teaching methods (because universities offer opportunities…

  18. Multiple Methods for Identifying Outcomes of a High Challenge Adventure Activity

    ERIC Educational Resources Information Center

    Davidson, Curt; Ewert, Alan; Chang, Yun

    2016-01-01

    The purpose of this study was to provide insight into what occurs in moments of high challenge within participants during an outdoor adventure education (OAE) program. Given the inherent risk and remote locations often associated with OAE programs, it has remained challenging to measure selected psychological constructs while the program is taking…

  19. Improving the Effectiveness of English Vocabulary Review by Integrating ARCS with Mobile Game-Based Learning

    ERIC Educational Resources Information Center

    Wu, Ting-Ting

    2018-01-01

    Memorizing English vocabulary is often considered uninteresting, and a lack of motivation exists during learning activities. Moreover, most vocabulary practice systems automatically select words from articles and do not provide integrated model methods for students. Therefore, this study constructed a mobile game-based English vocabulary practice…

  20. Drawing Inspiration from Human Brain Networks: Construction of Interconnected Virtual Networks

    PubMed Central

    Kominami, Daichi; Leibnitz, Kenji; Murata, Masayuki

    2018-01-01

    Virtualization of wireless sensor networks (WSN) is widely considered as a foundational block of edge/fog computing, which is a key technology that can help realize next-generation Internet of things (IoT) networks. In such scenarios, multiple IoT devices and service modules will be virtually deployed and interconnected over the Internet. Moreover, application services are expected to be more sophisticated and complex, thereby increasing the number of modifications required for the construction of network topologies. Therefore, it is imperative to establish a method for constructing a virtualized WSN (VWSN) topology that achieves low latency on information transmission and high resilience against network failures, while keeping the topological construction cost low. In this study, we draw inspiration from inter-modular connectivity in human brain networks, which achieves high performance when dealing with large-scale networks composed of a large number of modules (i.e., regions) and nodes (i.e., neurons). We propose a method for assigning inter-modular links based on a connectivity model observed in the cerebral cortex of the brain, known as the exponential distance rule (EDR) model. We then choose endpoint nodes of these links by controlling inter-modular assortativity, which characterizes the topological connectivity of brain networks. We test our proposed methods using simulation experiments. The results show that the proposed method based on the EDR model can construct a VWSN topology with an optimal combination of communication efficiency, robustness, and construction cost. Regarding the selection of endpoint nodes for the inter-modular links, the results also show that high assortativity enhances the robustness and communication efficiency because of the existence of inter-modular links of two high-degree nodes. PMID:29642483
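
A hedged sketch of the EDR-style link assignment described above: candidate inter-modular links are sampled with probability proportional to exp(-lambda * d), so nearby modules are wired more densely than distant ones. Module coordinates, lambda, and the link budget are invented for illustration; the paper's actual VWSN construction (including the assortativity control for endpoint selection) is more elaborate.

```python
import math
import random

random.seed(1)

def edr_links(modules, n_links, lam=1.0):
    """modules: {name: (x, y)}. Returns n_links (a, b) pairs sampled under
    the exponential distance rule, with replacement for simplicity."""
    names = list(modules)
    pairs = [(a, b) for i, a in enumerate(names) for b in names[i + 1:]]

    def dist(pair):
        (x1, y1), (x2, y2) = modules[pair[0]], modules[pair[1]]
        return math.hypot(x1 - x2, y1 - y2)

    # EDR: link probability decays exponentially with inter-module distance.
    weights = [math.exp(-lam * dist(p)) for p in pairs]
    return random.choices(pairs, weights=weights, k=n_links)

mods = {"A": (0, 0), "B": (1, 0), "C": (5, 0), "D": (0, 4)}
links = edr_links(mods, n_links=20)
```

In this toy layout the short A-B pair dominates the sample, which is exactly the distance-dependent bias the EDR model is meant to capture.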

  1. Automatic Extraction of Urban Built-Up Area Based on Object-Oriented Method and Remote Sensing Data

    NASA Astrophysics Data System (ADS)

    Li, L.; Zhou, H.; Wen, Q.; Chen, T.; Guan, F.; Ren, B.; Yu, H.; Wang, Z.

    2018-04-01

    Built-up area marks the use of urban construction land in different periods of development, and its accurate extraction is key to studying urban expansion. This paper studies the automatic extraction of urban built-up areas based on an object-oriented method and remote sensing data, realizing automatic extraction of a city's main built-up area and greatly reducing manual effort. First, construction land is extracted with the object-oriented method; the main technical steps are: (1) multi-resolution segmentation; (2) feature construction and selection; (3) rule-set-based extraction of construction land. The characteristic parameters used in the rule set mainly include the mean of the red band (Mean R), the Normalized Difference Vegetation Index (NDVI), the Ratio of Residential Index (RRI), and the mean of the blue band (Mean B); by combining these parameters, construction-land information can be extracted. Based on the adaptability, distance, and area of the object domain, the urban built-up area can then be quickly and accurately delineated from the construction-land information, without depending on other data or expert knowledge, to achieve automatic extraction of the urban built-up area. Beijing was used as the experimental area for these methods, and the results show that the built-up area was extracted automatically with a boundary accuracy of 2359.65 m, meeting the requirements. The automatic extraction of urban built-up areas is highly practical and can be applied to monitoring changes in a city's main built-up area.
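
One rule from such a rule set might look like the sketch below: per-object band means are combined into NDVI and thresholded to flag construction land. The thresholds, sample objects, and the decision rule itself are invented for illustration; the paper's actual rule set and RRI definition are not reproduced here.

```python
def ndvi(mean_nir, mean_red):
    """Normalized Difference Vegetation Index from per-object band means."""
    return (mean_nir - mean_red) / (mean_nir + mean_red)

def is_construction(obj, ndvi_max=0.2, red_min=80):
    """Low vegetation signal plus a bright red band -> likely built surface."""
    return (ndvi(obj["mean_nir"], obj["mean_red"]) < ndvi_max
            and obj["mean_red"] > red_min)

# Two hypothetical segmentation objects (band means on a 0-255 scale):
objects = [
    {"mean_red": 120, "mean_nir": 130, "mean_blue": 110},  # roof-like
    {"mean_red": 60, "mean_nir": 160, "mean_blue": 40},    # vegetation
]
flags = [is_construction(o) for o in objects]
```

In practice each flagged object would then be filtered by the area and distance criteria the abstract mentions before the built-up boundary is delineated.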

  2. Drawing Inspiration from Human Brain Networks: Construction of Interconnected Virtual Networks.

    PubMed

    Murakami, Masaya; Kominami, Daichi; Leibnitz, Kenji; Murata, Masayuki

    2018-04-08

    Virtualization of wireless sensor networks (WSN) is widely considered as a foundational block of edge/fog computing, which is a key technology that can help realize next-generation Internet of things (IoT) networks. In such scenarios, multiple IoT devices and service modules will be virtually deployed and interconnected over the Internet. Moreover, application services are expected to be more sophisticated and complex, thereby increasing the number of modifications required for the construction of network topologies. Therefore, it is imperative to establish a method for constructing a virtualized WSN (VWSN) topology that achieves low latency on information transmission and high resilience against network failures, while keeping the topological construction cost low. In this study, we draw inspiration from inter-modular connectivity in human brain networks, which achieves high performance when dealing with large-scale networks composed of a large number of modules (i.e., regions) and nodes (i.e., neurons). We propose a method for assigning inter-modular links based on a connectivity model observed in the cerebral cortex of the brain, known as the exponential distance rule (EDR) model. We then choose endpoint nodes of these links by controlling inter-modular assortativity, which characterizes the topological connectivity of brain networks. We test our proposed methods using simulation experiments. The results show that the proposed method based on the EDR model can construct a VWSN topology with an optimal combination of communication efficiency, robustness, and construction cost. Regarding the selection of endpoint nodes for the inter-modular links, the results also show that high assortativity enhances the robustness and communication efficiency because of the existence of inter-modular links of two high-degree nodes.

  3. Fluid management in the optimization of space construction

    NASA Technical Reports Server (NTRS)

    Snyder, Howard

    1990-01-01

    Fluid management has a strong impact on the optimization of space construction. Large quantities of liquids are needed for propellants and life support. The mass of propellant liquids is comparable to that required for the structures. There may be a strong dynamic interaction between the stored liquids and the space structure unless the design minimizes the interaction. The constraints of cost and time require optimization of the supply/resupply strategy. The proper selection and design of the fluid management methods for slosh control, stratification control, acquisition, transfer, gauging, venting, dumping, contamination control, selection of tank configuration and size, the storage state, and the control system can improve the entire system performance substantially. Our effort consists of building mathematical/computer models of the various fluid management methods and testing them against the available experimental data. The results of the models are used as inputs to the system operations studies. During the past year, the emphasis has been on modeling the transfer of cryogens, sloshing, and the storage configuration. The work has been intermeshed with ongoing NASA design and development studies to leverage the funds provided by the Center.

  4. Measuring Alexithymia via Trait Approach-I: A Alexithymia Scale Item Selection and Formation of Factor Structure

    PubMed Central

    TATAR, Arkun; SALTUKOĞLU, Gaye; ALİOĞLU, Seda; ÇİMEN, Sümeyye; GÜVEN, Hülya; AY, Çağla Ebru

    2017-01-01

    Introduction It is not clear in the literature whether available instruments are sufficient to measure alexithymia because of its theoretical structure. Moreover, it has been reported that several measuring instruments are needed to measure this construct, and all the instruments have different error sources. The old and the new forms of Toronto Alexithymia Scale are the only instruments available in Turkish. Thus, the purpose of this study was to develop a new scale to measure alexithymia, selecting items and constructing the factor structure. Methods A total of 1117 patients aged from 19 to 82 years (mean = 35.05 years) were included. A 100-item pool was prepared and applied to 628 women and 489 men. Data were analyzed using Explanatory Factor Analysis, Confirmatory Factor Analysis, and Item Response Theory and 28 items were selected. The new form of 28 items was applied to 415 university students, including 271 women and 144 men aged from 18 to 30 (mean=21.44). Results The results of Explanatory Factor Analysis revealed a five-factor construct of “Solving and Expressing Affective Experiences,” “External Locused Cognitive Style,” “Tendency to Somatize Affections,” “Imaginary Life and Visualization,” and “Acting Impulsively,” along with a two-factor construct representing the “Affective” and “Cognitive” components. All the components of the construct showed good model fit and high internal consistency. The new form was tested in terms of internal consistency, test-retest reliability, and concurrent validity using Toronto Alexithymia Scale as criteria and discriminative validity using Five-Factor Personality Inventory Short Form. Conclusion The results showed that the new scale met the basic psychometric requirements. Results have been discussed in line with related studies. PMID:29033633

  5. Identifying Patients with Atrioventricular Septal Defect in Down Syndrome Populations by Using Self-Normalizing Neural Networks and Feature Selection.

    PubMed

    Pan, Xiaoyong; Hu, Xiaohua; Zhang, Yu Hang; Feng, Kaiyan; Wang, Shao Peng; Chen, Lei; Huang, Tao; Cai, Yu Dong

    2018-04-12

    Atrioventricular septal defect (AVSD) is a clinically significant subtype of congenital heart disease (CHD) that severely influences the health of babies during birth and is associated with Down syndrome (DS). Thus, exploring the differences in functional genes in DS samples with and without AVSD is a critical way to investigate the complex association between AVSD and DS. In this study, we present a computational method to distinguish DS patients with AVSD from those without AVSD using the newly proposed self-normalizing neural network (SNN). First, each patient was encoded by using the copy number of probes on chromosome 21. The encoded features were ranked by the reliable Monte Carlo feature selection (MCFS) method to obtain a ranked feature list. Based on this feature list, we used a two-stage incremental feature selection to construct two series of feature subsets and applied SNNs to build classifiers to identify optimal features. Results show that 2737 optimal features were obtained, and the corresponding optimal SNN classifier constructed on these features yielded a Matthews correlation coefficient (MCC) of 0.748. For comparison, random forest was also used to build classifiers and uncover optimal features; this method achieved an optimal MCC of 0.582 when the top 132 features were utilized. Finally, we analyzed some key features among the optimal features identified by the SNN, finding support in the literature that further reveals their essential roles.
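
A hedged sketch of the incremental feature selection loop described above: features arrive pre-ranked (here by assumption, standing in for MCFS), nested subsets are tried in order, and the subset whose classifier attains the best Matthews correlation coefficient is kept. The toy data and the nearest-centroid classifier are stand-ins for the paper's SNN pipeline.

```python
import math

def mcc(tp, tn, fp, fn):
    """Matthews correlation coefficient from a binary confusion matrix."""
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / denom if denom else 0.0

def centroid_predict(train_X, train_y, x, feats):
    """Classify x by the nearer class centroid, restricted to feats."""
    def centroid(label):
        rows = [r for r, y in zip(train_X, train_y) if y == label]
        return [sum(r[f] for r in rows) / len(rows) for f in feats]
    def sqdist(c):
        return sum((x[f] - cf) ** 2 for f, cf in zip(feats, c))
    return 1 if sqdist(centroid(1)) < sqdist(centroid(0)) else 0

X = [[2.0, 0.1, 5.0], [2.2, 0.2, 4.0], [0.1, 0.0, 5.1], [0.2, 0.1, 4.9]]
y = [1, 1, 0, 0]
ranked = [0, 1, 2]  # assumed MCFS-style ranking

best = (None, -1.0)
for k in range(1, len(ranked) + 1):
    feats = ranked[:k]
    # Resubstitution accuracy for brevity; real pipelines cross-validate.
    preds = [centroid_predict(X, y, xi, feats) for xi in X]
    tp = sum(p == 1 == t for p, t in zip(preds, y))
    tn = sum(p == 0 == t for p, t in zip(preds, y))
    fp = sum(p == 1 and t == 0 for p, t in zip(preds, y))
    fn = sum(p == 0 and t == 1 for p, t in zip(preds, y))
    score = mcc(tp, tn, fp, fn)
    if score > best[1]:
        best = (feats, score)
```

Here the first ranked feature already separates the two toy classes, so the loop settles on the smallest subset with the top MCC.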

  6. Feasibility study of structured diagnosis methods for functional dyspepsia in Korean medicine clinics.

    PubMed

    Park, Jeong Hwan; Kim, Soyoung; Park, Jae-Woo; Ko, Seok-Jae; Lee, Sanghun

    2017-12-01

    Functional dyspepsia (FD) is the seventh most common disease encountered in Korean medicine (KM) clinics. Despite the large number of FD patients visiting KM clinics, the accumulated medical records have no utility in evidence development, due to being unstructured. This study aimed to construct a standard operating procedure (SOP) with appropriate structured diagnostic methods for FD, and assess the feasibility for use in KM clinics. Two rounds of professional surveys were conducted by 10 Korean internal medicine professors to select the representative diagnostic methods. A feasibility study was conducted to evaluate compliance and time required for using the structured diagnostic methods by three specialists in two hospitals. As per the results of the professional survey, five questionnaires and one basic diagnostic method were selected. An SOP was constructed based on the survey results, and a feasibility study showed that the SOP compliance score (out of 5) was 3.45 among the subjects, and 3.25 among the practitioners. The SOP was acceptable and was not deemed difficult to execute. The total execution time was 136.5 minutes, out of which the gastric emptying test time was 129 minutes. This feasibility study of the SOP with structured diagnostic methods for FD confirmed it was adequate for use in KM clinics. It is expected that these study findings will be helpful to clinicians who wish to conduct observational studies as well as to generate quantitative medical records to facilitate Big Data research.

  7. Exploring the Driving Factors of Construction Industrialization Development in China.

    PubMed

    Xiahou, Xiaer; Yuan, Jingfeng; Liu, Yan; Tang, Yuchun; Li, Qiming

    2018-03-03

    Construction industrialization (CI) has been adopted worldwide because of its potential benefits. However, current research shows the incentives for adopting CI may differ in different regions. While the promotion of CI in China is still at the initial stage, a systematic analysis of the driving factors would help decision makers get a comprehensive understanding of CI development and select proper strategies to promote CI. This research combines qualitative and quantitative methods to explore the construction industrialization driving factors (CIDFs) in China. The grounded theory method (GTM) was employed to explore CI concepts among 182 CI-related articles published in 10 top-tier journals from 2000 to 2017. A total of 15 CIDFs were identified, including one suggested by professionals during a pre-test questionnaire survey. The analysis showed that the development of CI in China is pushed by macrodevelopment and pulled by the government and is also a self-driven process. The major driving factors for CI adoption in China are the transformation and upgrade of the conventional construction industry and the solution of development dilemmas. Our study also suggests that pilot programs are currently the most effective method to promote CI in China and to accumulate experience so as to gain recognition by society. This research is also of value for CI promotion in other developing countries.

  8. Exploring the Driving Factors of Construction Industrialization Development in China

    PubMed Central

    Xiahou, Xiaer; Yuan, Jingfeng; Tang, Yuchun; Li, Qiming

    2018-01-01

    Construction industrialization (CI) has been adopted worldwide because of its potential benefits. However, current research shows that the incentives for adopting CI may differ between regions. While the promotion of CI in China is still at the initial stage, a systematic analysis of the driving factors would help decision makers gain a comprehensive understanding of CI development and select proper strategies to promote CI. This research combines qualitative and quantitative methods to explore the construction industrialization driving factors (CIDFs) in China. The grounded theory method (GTM) was employed to explore CI concepts in 182 CI-related articles published in 10 top-tier journals from 2000 to 2017. A total of 15 CIDFs were identified, including one suggested by professionals during a pre-test questionnaire survey. The analysis showed that the development of CI in China is pushed by macrodevelopment, pulled by the government, and is also a self-driven process. The major driving factors for CI adoption in China are the transformation and upgrade of the conventional construction industry and the resolution of development dilemmas. Our study also suggests that pilot programs are currently the most effective method to promote CI in China and to accumulate experience so as to gain recognition from society. This research is also of value for CI promotion in other developing countries. PMID:29510507

  9. Mining a clinical data warehouse to discover disease-finding associations using co-occurrence statistics.

    PubMed

    Cao, Hui; Markatou, Marianthi; Melton, Genevieve B; Chiang, Michael F; Hripcsak, George

    2005-01-01

    This paper applies co-occurrence statistics to discover disease-finding associations in a clinical data warehouse. We used two methods, chi2 statistics and the proportion confidence interval (PCI) method, to measure the dependence of pairs of diseases and findings, and then used heuristic cutoff values for association selection. An intrinsic evaluation showed that 94 percent of disease-finding associations obtained by chi2 statistics and 76.8 percent obtained by the PCI method were true associations. The selected associations were used to construct knowledge bases of disease-finding relations (KB-chi2, KB-PCI). An extrinsic evaluation showed that both KB-chi2 and KB-PCI could assist in eliminating clinically non-informative and redundant findings from problem lists generated by our automated problem list summarization system.
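    The chi2 scoring described above can be sketched generically. The following is a hypothetical illustration of scoring one disease-finding pair from a 2x2 co-occurrence table; the actual warehouse counts and heuristic cutoffs of the paper are not reproduced here.

```python
# Illustrative sketch (assumed toy counts, not the paper's data):
# score a disease-finding pair by the chi-squared statistic of its
# 2x2 co-occurrence contingency table.

def chi2_2x2(a, b, c, d):
    """Pearson chi-squared statistic for a 2x2 table:
    a = records with both disease and finding,
    b = disease without finding, c = finding without disease,
    d = records with neither."""
    n = a + b + c + d
    # Shortcut formula for 2x2 tables
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

# Toy counts: the disease and finding co-occur far more often than chance.
stat = chi2_2x2(a=40, b=10, c=15, d=935)
# A statistic well above the 3.84 critical value (p = 0.05, 1 df) would
# pass a heuristic cutoff for keeping the association.
```

A PCI-style score would instead put a confidence interval around the proportion of disease records mentioning the finding; both reduce to simple arithmetic over the same four counts.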

  10. Biophysical evaluation of footwear for cold-weather climates.

    PubMed

    Santee, W R; Endrusick, T L

    1988-02-01

    Proper selection of footwear for cold-wet environments is important in determining individual performance and comfort. Testing only total dry insulation (It) is not a wholly adequate basis for boot selection. The present study demonstrates an effective method for evaluating the effects of surface moisture on boot insulation. This method allows a more knowledgeable selection of footwear for cold-wet climates. In this study, regional insulation values were obtained under dry conditions, then during a soak in shallow water, and finally for insulation recovery after removal from water. Results for seven boots show no advantage of presently used synthetic materials during short soak episodes. Insulated leather-synthetic boots, however, recovered to dry insulation levels more rapidly than more traditional insulated leather boots. Rubber waterproof bottoms were the most effective boot construction for retaining insulation levels during water exposure.

  11. Surrogate-based Analysis and Optimization

    NASA Technical Reports Server (NTRS)

    Queipo, Nestor V.; Haftka, Raphael T.; Shyy, Wei; Goel, Tushar; Vaidyanathan, Raj; Tucker, P. Kevin

    2005-01-01

    A major challenge to the successful full-scale development of modern aerospace systems is to address competing objectives such as improved performance, reduced costs, and enhanced safety. Accurate, high-fidelity models are typically time consuming and computationally expensive. Furthermore, informed decisions should be made with an understanding of the impact (global sensitivity) of the design variables on the different objectives. In this context, the so-called surrogate-based approach for analysis and optimization can play a very valuable role. The surrogates are constructed using data drawn from high-fidelity models, and provide fast approximations of the objectives and constraints at new design points, thereby making sensitivity and optimization studies feasible. This paper provides a comprehensive discussion of the fundamental issues that arise in surrogate-based analysis and optimization (SBAO), highlighting concepts, methods, techniques, as well as practical implications. The issues addressed include the selection of the loss function and regularization criteria for constructing the surrogates, design of experiments, surrogate selection and construction, sensitivity analysis, convergence, and optimization. The multi-objective optimal design of a liquid rocket injector is presented to highlight the state of the art and to help guide future efforts.
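    The surrogate idea can be reduced to a toy sketch: sample an "expensive" model at a few design points, fit a cheap quadratic through them, and recover the surrogate's minimizer analytically. The objective below is a made-up stand-in, not the rocket-injector problem from the paper.

```python
# Minimal surrogate sketch (assumed 1-D toy objective):
# fit an interpolating quadratic to three expensive samples and use
# its analytic minimizer as the next candidate design point.

def expensive_model(x):
    return (x - 1.7) ** 2 + 3.0  # pretend each call costs hours of CFD

def quadratic_surrogate(x1, x2, x3):
    """Interpolate y = a*x^2 + b*x + c through three sampled points
    using Newton divided differences; return (a, b)."""
    y1, y2, y3 = expensive_model(x1), expensive_model(x2), expensive_model(x3)
    d1 = (y2 - y1) / (x2 - x1)
    d2 = ((y3 - y2) / (x3 - x2) - d1) / (x3 - x1)
    a = d2
    b = d1 - d2 * (x1 + x2)
    return a, b

a, b = quadratic_surrogate(0.0, 1.0, 3.0)
x_next = -b / (2 * a)  # surrogate-predicted optimum, recovered cheaply
```

Because the toy objective is itself quadratic, the surrogate's minimizer lands exactly on the true optimum; in practice the surrogate is refit as new high-fidelity samples arrive.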

  12. Free variable selection QSPR study to predict 19F chemical shifts of some fluorinated organic compounds using Random Forest and RBF-PLS methods

    NASA Astrophysics Data System (ADS)

    Goudarzi, Nasser

    2016-04-01

    In this work, two new and powerful chemometrics methods are applied to the modeling and prediction of the 19F chemical shift values of some fluorinated organic compounds. The radial basis function-partial least squares (RBF-PLS) and random forest (RF) methods are employed to construct models to predict the 19F chemical shifts. No separate variable selection method was used in this study, since the RF method can serve as both a variable selection and a modeling technique. The effects of the important parameters governing RF prediction power, such as the number of trees (nt) and the number of randomly selected variables used to split each node (m), were investigated. The root-mean-square errors of prediction (RMSEP) for the training set and the prediction set for the RBF-PLS and RF models were 44.70, 23.86, 29.77 and 23.69, respectively. Also, the correlation coefficients of the prediction set for the RBF-PLS and RF models were 0.8684 and 0.9313, respectively. The results reveal that the RF model can be used as a powerful chemometrics tool for quantitative structure-property relationship (QSPR) studies.

  13. Computational Prediction of Protein Epsilon Lysine Acetylation Sites Based on a Feature Selection Method.

    PubMed

    Gao, JianZhao; Tao, Xue-Wen; Zhao, Jia; Feng, Yuan-Ming; Cai, Yu-Dong; Zhang, Ning

    2017-01-01

    Lysine acetylation, as one type of post-translational modification (PTM), plays key roles in cellular regulation and can be involved in a variety of human diseases. However, it is often costly and time-consuming to identify lysine acetylation sites with traditional experimental approaches. Therefore, effective computational methods should be developed to predict the acetylation sites. In this study, we developed a position-specific method for epsilon lysine acetylation site prediction. Sequences of acetylated proteins were retrieved from the UniProt database. Various kinds of features, such as the position specific scoring matrix (PSSM), amino acid factors (AAF) and disorder, were incorporated. A feature selection method based on mRMR (Maximum Relevance Minimum Redundancy) and IFS (Incremental Feature Selection) was employed. Finally, 319 optimal features were selected from a total of 541 features. Using the 319 optimal features to encode peptides, a predictor was constructed based on dagging. As a result, an accuracy of 69.56% with an MCC of 0.2792 was achieved. We analyzed the optimal features, which suggested some important factors determining the lysine acetylation sites. Analysis of the optimal features provided insights into the mechanism of lysine acetylation sites, offering guidance for experimental validation.

  14. Consequences of least tern (Sternula antillarum) microhabitat nest-site selection on natural and mechanically constructed sandbars in the Missouri River

    USGS Publications Warehouse

    Stucker, Jennifer H.; Buhl, Deborah A.; Sherfy, Mark H.

    2013-01-01

    Nest-habitat selection in colonial species has rarely been assessed at multiple spatial scales to evaluate its fitness consequences. Management for the federally endangered U.S. Interior population of Least Terns (Sternula antillarum) has focused on maintenance of breeding habitats, including mechanical construction of sandbars from dredged material. Least Terns are attracted to large areas of unvegetated substrate, yet small-scale habitat features are thought to trigger selection for nesting. We evaluated nest-scale habitat selection to determine (1) whether selection differs between constructed and natural sandbars and (2) the subsequent consequences of habitat selection on nest success. During 2006–2008, we examined 869 Least Tern nest sites on constructed and natural sandbars in the Missouri River for evidence of microhabitat selection at the nest in relation to habitat within the surrounding 3-m area. Least Tern nest sites had coarser and larger substrate materials at the nest, more debris, and less vegetation than the surrounding area. Nests in constructed habitats had a greater percentage of coarse substrates and less vegetation or debris than nests in naturally created habitats. Apparent nest success was 1.8× greater on constructed than on natural sandbars. Nest success was best predicted by models with two spatial scales of predictors, including substrates (nest) and vegetation and debris (nest or surrounding area). Our results indicate that Least Terns select nest microhabitat characteristics that are associated with wind- and water-scoured habitats, and that nest success increases when these habitats are selected.

  15. All-cause mortality of elderly Australian veterans using COX-2 selective or non-selective NSAIDs: a longitudinal study

    PubMed Central

    Kerr, Stephen J; Rowett, Debra S; Sayer, Geoffrey P; Whicker, Susan D; Saltman, Deborah C; Mant, Andrea

    2011-01-01

    AIM To determine hazard ratios for all-cause mortality in elderly Australian veterans taking COX-2 selective and non-selective NSAIDs. METHODS Patient cohorts were constructed from claims databases (1997 to 2007) for veterans and dependants with full treatment entitlement irrespective of military service. Patients were grouped by initial exposure: celecoxib, rofecoxib, meloxicam, diclofenac or non-selective NSAID. A reference group was constructed of patients receiving glaucoma/hypothyroid medications and none of the study medications. Univariate and multivariate analyses were performed using Cox proportional hazards regression models. Hazard ratios (HR) and 95% confidence intervals (CI) were estimated for each exposure group against the reference group. The final model was adjusted for age, gender and co-prescription as a surrogate for cardiovascular risk. Patients were censored if the gap in supply of the study prescription exceeded 30 days or if another study medication was initiated. The outcome measure in all analyses was death. RESULTS Hazard ratios and 95% CIs, adjusted for age, gender and cardiovascular risk, for each group relative to the reference group were: celecoxib 1.39 (1.25, 1.55), diclofenac 1.44 (1.28, 1.62), meloxicam 1.49 (1.25, 1.78), rofecoxib 1.58 (1.39, 1.79), non-selective NSAIDs 1.76 (1.59, 1.94). CONCLUSIONS In this large cohort of Australian veterans exposed to COX-2 selective and non-selective NSAIDs, there was a significantly increased mortality risk for those exposed to either COX-2-selective or non-selective NSAIDs relative to those exposed to unrelated (glaucoma/hypothyroid) medications. PMID:21276041

  16. Plate-based diversity subset screening generation 2: an improved paradigm for high-throughput screening of large compound files.

    PubMed

    Bell, Andrew S; Bradley, Joseph; Everett, Jeremy R; Loesel, Jens; McLoughlin, David; Mills, James; Peakman, Marie-Claire; Sharp, Robert E; Williams, Christine; Zhu, Hongyao

    2016-11-01

    High-throughput screening (HTS) is an effective method for lead and probe discovery that is widely used in industry and academia to identify novel chemical matter and to initiate the drug discovery process. However, HTS can be time consuming and costly and the use of subsets as an efficient alternative to screening entire compound collections has been investigated. Subsets may be selected on the basis of chemical diversity, molecular properties, biological activity diversity or biological target focus. Previously, we described a novel form of subset screening: plate-based diversity subset (PBDS) screening, in which the screening subset is constructed by plate selection (rather than individual compound cherry-picking), using algorithms that select for compound quality and chemical diversity on a plate basis. In this paper, we describe a second-generation approach to the construction of an updated subset: PBDS2, using both plate and individual compound selection, that has an improved coverage of the chemical space of the screening file, whilst only selecting the same number of plates for screening. We describe the validation of PBDS2 and its successful use in hit and lead discovery. PBDS2 screening became the default mode of singleton (one compound per well) HTS for lead discovery in Pfizer.

  17. Data collected to support monitoring of constructed emergent sandbar habitat on the Missouri River downstream from Gavins Point Dam, South Dakota and Nebraska, 2004-06

    USGS Publications Warehouse

    Thompson, Ryan F.; Johnson, Michaela R.; Andersen, Michael J.

    2007-01-01

    The U.S. Army Corps of Engineers has constructed emergent sandbar habitat on sections of the Missouri River bordering South Dakota and Nebraska downstream from Gavins Point Dam to create and enhance habitat for threatened and endangered bird species. Two areas near river miles 761.3 and 769.8 were selected for construction of emergent sandbar habitat. Pre- and postconstruction data were collected by the U.S. Geological Survey, in cooperation with the U.S. Army Corps of Engineers, to evaluate the success of the habitat management techniques. Data collected include pre- and postconstruction channel-geometry data (bathymetric and topographic) for areas upstream from, downstream from, and within each construction site. Water-velocity data were collected for selected parts of the site near river mile 769.8. Instruments and methods used in data collection, as well as quality-assurance and quality-control measures, are described. Geospatial channel-geometry data are presented for transects of the river channel as cross sections and as geographical information system shapefiles. Geospatial land-surface elevation data are provided for part of each site in the form of a color-shaded relief map. Geospatial water-velocity data also are provided as color-shaded maps and geographical information system shapefiles.

  18. Shuttle user analysis (study 2.2). Volume 4: Standardized subsystem modules analysis

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The capability to analyze payloads constructed of standardized modules was provided for the planning of future mission models. An inventory of standardized module designs previously obtained was used as a starting point. Some of the conclusions and recommendations are: (1) the two growth factor synthesis methods provide logical configurations for satellite type selection; (2) the recommended method is the one that determines the growth factor as a function of the baseline subsystem weight, since it provides a larger growth factor for small subsystem weights and results in a greater overkill due to standardization; (3) the method that is not recommended is the one that depends upon a subsystem similarity selection, since care must be used in the subsystem similarity selection; (4) it is recommended that the application of standardized subsystem factors be limited to satellites with baseline dry weights between about 700 and 6,500 lbs; and (5) the standardized satellite design approach applies to satellites maintainable in orbit or retrieved for ground maintenance.

  19. Analysis of training sample selection strategies for regression-based quantitative landslide susceptibility mapping methods

    NASA Astrophysics Data System (ADS)

    Erener, Arzu; Sivas, A. Abdullah; Selcuk-Kestel, A. Sevtap; Düzgün, H. Sebnem

    2017-07-01

    All quantitative landslide susceptibility mapping (QLSM) methods require two basic data types, namely a landslide inventory and factors that influence landslide occurrence (landslide influencing factors, LIF). Depending on the type of landslides, the nature of the triggers and the LIF, the accuracy of QLSM methods differs. Moreover, how to balance the number of 0s (non-occurrence) and 1s (occurrence) in the training set obtained from the landslide inventory, and how to select which of the 1s and 0s to include in QLSM models, play a critical role in the accuracy of QLSM. Although the performance of various QLSM methods has been investigated extensively in the literature, the challenge of training set construction has not been adequately investigated for QLSM methods. To tackle this challenge, in this study three different training set selection strategies, along with the original data set, are used to test the performance of three regression methods, namely Logistic Regression (LR), Bayesian Logistic Regression (BLR) and Fuzzy Logistic Regression (FLR). The first sampling strategy is proportional random sampling (PRS), which takes into account a weighted selection of landslide occurrences in the sample set. The second method, non-selective nearby sampling (NNS), includes randomly selected sites and their surrounding neighboring points at certain preselected distances to include the impact of clustering. Selective nearby sampling (SNS) is the third method, which concentrates on the group of 1s and their surrounding neighborhood; a randomly selected group of landslide sites and their neighborhood are considered in the analyses with parameters similar to NNS. It is found that the LR-PRS, FLR-PRS and BLR-whole-data set-ups, in that order, yield the best fits among the alternatives. The results indicate that in QLSM based on regression models, avoidance of spatial correlation in the data set is critical for model performance.
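    The first strategy above, proportional random sampling, amounts to drawing a training set whose occurrence/non-occurrence balance follows preset proportions. The following hedged sketch uses invented cell IDs and an assumed 30% positive fraction purely for illustration; it is not the paper's setup.

```python
# PRS sketch (assumed toy inventory and proportions):
# draw n training cells with a fixed fraction of landslide
# occurrences (label 1), the rest non-occurrences (label 0).
import random

def proportional_random_sample(occurrences, absences, n, pos_fraction, seed=0):
    rng = random.Random(seed)
    n_pos = round(n * pos_fraction)
    pos = rng.sample(occurrences, n_pos)          # weighted share of 1s
    neg = rng.sample(absences, n - n_pos)         # remaining 0s
    sample = [(cell, 1) for cell in pos] + [(cell, 0) for cell in neg]
    rng.shuffle(sample)
    return sample

# Toy inventory: cells 0-99 are occurrences, 100-999 are absences.
train = proportional_random_sample(list(range(100)),
                                   list(range(100, 1000)),
                                   n=200, pos_fraction=0.3)
```

NNS and SNS would differ only in how the candidate pools are built: around randomly chosen sites, or around the occurrence cells themselves, at preselected distances.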

  20. Array-Based Discovery of Aptamer Pairs

    DTIC Science & Technology

    2014-12-11

    affinities greatly exceeding either monovalent component. DNA aptamers are especially well-suited for such constructs, because they can be linked via...standard synthesis techniques without requiring chemical conjugation. Unfortunately, aptamer pairs are difficult to generate, primarily because...conventional selection methods preferentially yield aptamers that recognize a dominant “hot spot” epitope.

  1. Where Have All the Indians Gone? American Indian Representation in Secondary History Textbooks

    ERIC Educational Resources Information Center

    Shadowwalker, Depree M.

    2012-01-01

    This dissertation used a mixed method to develop an analytical model from a random selection of one of eight secondary history textbooks for instances of Indians to determine if the textual content: (1) constructs negative or inaccurate knowledge through word choice or narratives; (2) reinforces stereotype portraits; (3) omits similar minority…

  2. A Phenomenological Study of the Lived Experiences of Social Studies Teachers: Constructing Ideas about Democratic Citizenship and Teaching

    ERIC Educational Resources Information Center

    Thapa, Om Kumar

    2016-01-01

    The purpose of the study was to explore how social studies teachers conceptualized democracy, developed ideas about democratic citizenship, and incorporated their perspectives and experiences into teaching. The study used a phenomenological approach within a qualitative research design. Six participants were selected using a convenience sampling method with…

  3. Tell Me Your Story: Analysis of Script Topics Selected by Persons with Aphasia

    ERIC Educational Resources Information Center

    Holland, Audrey L.; Halper, Anita S.; Cherney, Leora R.

    2010-01-01

    Purpose: This study examined the content of 100 short scripts, co-constructed by persons with aphasia (PWA) and a clinician. The PWA subsequently learned the scripts by interacting with a computerized virtual therapist. The goal was to provide clinicians with ideas regarding content for treatment that is meaningful to PWAs. Method: Thirty-three…

  4. Development of a Self-Report Tool to Evaluate Hearing Aid Outcomes among Chinese Speakers

    ERIC Educational Resources Information Center

    Wong, Lena L. N.; Hang, Na

    2014-01-01

    Purpose: This article reports on the development of a self-report tool--the Chinese Hearing Aid Outcomes Questionnaire (CHAOQ)--to evaluate hearing aid outcomes among Chinese speakers. Method: There were 4 phases to construct the CHAOQ and evaluate its psychometric properties. First, items were selected to evaluate a range of culturally relevant…

  5. A novel alternative method for 3D visualisation in Parasitology: the construction of a 3D model of a parasite from 2D illustrations.

    PubMed

    Teo, B G; Sarinder, K K S; Lim, L H S

    2010-08-01

    Three-dimensional (3D) models of the marginal hooks, dorsal and ventral anchors, bars and haptoral reservoirs of a parasite, Sundatrema langkawiense Lim & Gibson, 2009 (Monogenea) were developed using the polygonal modelling method in Autodesk 3ds Max (Version 9) based on two-dimensional (2D) illustrations. Maxscripts were written to rotate the modelled 3D structures. Appropriately orientated 3D haptoral hard-parts were then selected and positioned within the transparent 3D outline of the haptor and grouped together to form a complete 3D haptoral entity. This technique is an inexpensive tool for constructing 3D models from 2D illustrations for 3D visualisation of the spatial relationships between the different structural parts within organisms.

  6. Multi-objective evolutionary optimization for constructing neural networks for virtual reality visual data mining: application to geophysical prospecting.

    PubMed

    Valdés, Julio J; Barton, Alan J

    2007-05-01

    A method for the construction of virtual reality spaces for visual data mining using multi-objective optimization with genetic algorithms on nonlinear discriminant (NDA) neural networks is presented. Two neural network layers (the output and the last hidden) are used for the construction of simultaneous solutions for: (i) a supervised classification of data patterns and (ii) an unsupervised similarity structure preservation between the original data matrix and its image in the new space. A set of spaces are constructed from selected solutions along the Pareto front. This strategy represents a conceptual improvement over spaces computed by single-objective optimization. In addition, genetic programming (in particular gene expression programming) is used for finding analytic representations of the complex mappings generating the spaces (a composition of NDA and orthogonal principal components). The presented approach is domain independent and is illustrated via application to the geophysical prospecting of caves.

  7. Apparatus for measurements of thermal and optical stimulated exo-electron emission and luminescence

    NASA Astrophysics Data System (ADS)

    Pokorný, P.; Novotný, M.; Fitl, P.; Zuklín, J.; Vlček, J.; Nikl, J.; Marešová, E.; Hruška, P.; Bulíř, J.; Drahokoupil, J.; Čerňanský, M.; Lančok, J.

    2018-06-01

    A vacuum apparatus was designed, constructed and implemented for simultaneously measuring three or more stimulated phenomena in dielectrics, and eventually semiconductors, in order to investigate those phenomena as a function of temperature and wavelength. The equipment and its functionality were tested step by step (apparatus, components and a control sample), together with calculation of the main physical parameters. Tests of the individual parts of the apparatus clearly confirmed that the design, construction and selected components fulfil or even exceed the required properties. Measurements on a selected sample showed that even weak signals from the material can be detected by both thermally stimulated luminescence and thermally stimulated exo-electron emission; moreover, transmission and desorption can also be measured. NaCl:Ni (0.2%) was chosen as the test material. The activation energies and the frequency factor were calculated using the methods of different authors.

  8. Descemet's Stripping Automated Endothelial Keratoplasty Tissue Insertion Devices

    PubMed Central

    Khan, Salman Nasir; Shiakolas, Panos S.; Mootha, Venkateswara Vinod

    2015-01-01

    This review study provides information regarding the construction, design, and use of six commercially available endothelial allograft insertion devices applied for Descemet's stripping automated endothelial keratoplasty (DSAEK). We also highlight issues being faced in DSAEK and discuss the methods through which medical devices such as corneal inserters may alleviate these issues. Inserter selection is of high importance in the DSAEK procedure, since overcoming the learning curve associated with the use of an insertion device is a time- and energy-consuming process. In the present review, allograft insertion devices were compared in terms of design, construction material, insertion technique, dimensions, incision requirements and endothelial cell loss to show their relative merits and capabilities based on available data in the literature. Moreover, the advantages and disadvantages of various insertion devices used for allograft insertion in DSAEK are reviewed and compared. The information presented in this review can be utilized for better selection of an insertion device for DSAEK. PMID:27051492

  9. Analytical application of solid contact ion-selective electrodes for determination of copper and nitrate in various food products and drinking water.

    PubMed

    Wardak, Cecylia; Grabarczyk, Malgorzata

    2016-08-02

    A simple, fast and cheap method for monitoring copper and nitrate in drinking water and food products using newly developed solid contact ion-selective electrodes is proposed. Determination of copper and nitrate was performed by application of the multiple standard additions technique. The reliability of the results was assessed by comparing them with results obtained by anodic stripping voltammetry or spectrophotometry for the same samples. In each case, satisfactory agreement was obtained, which confirms the analytical usefulness of the constructed electrodes.

  10. Site selection model for new metro stations based on land use

    NASA Astrophysics Data System (ADS)

    Zhang, Nan; Chen, Xuewu

    2015-12-01

    Since the construction of a metro system generally lags behind the development of urban land use, the sites of metro stations should adapt to their surroundings, an issue rarely discussed in previous research on station layout. This paper proposes a new site selection model to find the best location for a metro station, establishing an indicator system based on land use and combining AHP with the entropy weight method to obtain a ranking of the candidate schemes. The feasibility and efficiency of this model have been validated by evaluating Nanjing Shengtai Road station and other potential sites.
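    The entropy weight step mentioned above is well defined: criteria whose values vary more across candidate sites carry more information and receive larger objective weights. The 3-site, 3-criterion matrix below is a made-up illustration, not the Nanjing case study's data.

```python
# Entropy weight method sketch (assumed toy decision matrix):
# matrix[i][j] is the benefit-type score of site i on criterion j (> 0).
import math

def entropy_weights(matrix):
    """Return one weight per criterion, summing to 1."""
    m, n = len(matrix), len(matrix[0])
    k = 1.0 / math.log(m)
    divergences = []
    for j in range(n):
        col = [row[j] for row in matrix]
        total = sum(col)
        p = [v / total for v in col]
        entropy = -k * sum(pi * math.log(pi) for pi in p if pi > 0)
        divergences.append(1.0 - entropy)  # information carried by criterion j
    s = sum(divergences)
    return [d / s for d in divergences]

# Criterion 1 (index 1) separates the sites strongly, so it dominates;
# criterion 0 is identical across sites and gets (near-)zero weight.
w = entropy_weights([[0.5, 0.9, 0.4],
                     [0.5, 0.1, 0.5],
                     [0.5, 0.5, 0.6]])
```

In a combined AHP/entropy scheme, these objective weights are typically blended with subjective AHP weights before scoring the candidate sites.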

  11. Construction of an adaptable European transnational ecological deprivation index: the French version.

    PubMed

    Pornet, Carole; Delpierre, Cyrille; Dejardin, Olivier; Grosclaude, Pascale; Launay, Ludivine; Guittet, Lydia; Lang, Thierry; Launoy, Guy

    2012-11-01

    Studying social disparities in health implies the ability to measure them accurately, to compare them between different areas or countries, and to follow trends over time. This study proposes a method for constructing a French European deprivation index that will be replicable in several European countries and is related to an individual deprivation indicator constructed from a European survey specifically designed to study deprivation. Using individual data from the European Union Statistics on Income and Living Conditions (EU-SILC) survey, goods and services indicated by individuals as fundamental needs, the lack of which reflects deprivation, were selected. From this definition, which is specific to a cultural context, an individual deprivation indicator was constructed by selecting fundamental needs associated with both objective and subjective poverty. Next, the authors selected, from among the variables available in both the EU-SILC survey and the French national census, those best reflecting individual experience of deprivation, using multivariate logistic regression. An ecological measure of deprivation was provided for all of the smallest French geographical units. Preliminary validation showed a stronger association between the French European Deprivation Index (EDI) score and both income and education than for the Townsend index, partly ensuring its ability to measure individual socioeconomic status. This index, which is specific to a particular cultural and social policy context, could be replicated in 25 other European countries, thereby allowing European comparisons. EDI could also be reproducible over time. EDI could prove to be a relevant tool in evidence-based policy-making for measuring and reducing social disparities in health, and even outside the medical domain.

  12. Construction project selection with the use of fuzzy preference relation

    NASA Astrophysics Data System (ADS)

    Ibadov, Nabi

    2016-06-01

    In this article, the author describes the problem of selecting a construction project variant during the pre-investment phase. As a solution, an algorithm based on fuzzy preference relations is presented. The article provides an example of the algorithm used to select the best variant for a construction project. The choice is made based on criteria such as net present value (NPV), level of technological difficulty, financing possibilities, and level of organizational difficulty.
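    One minimal way to rank variants with a fuzzy preference relation is sketched below: p[i][j] in [0, 1] expresses the degree to which variant i is preferred over variant j (with p[i][j] + p[j][i] = 1), and variants are ordered by their mean preference over the others. The matrix values are invented, and this simple mean-dominance ordering is an assumed stand-in for the paper's actual algorithm.

```python
# Fuzzy preference relation ranking sketch (assumed toy matrix):
# order variants by their average preference degree over the others.

def rank_variants(p):
    n = len(p)
    scores = [sum(p[i][j] for j in range(n) if j != i) / (n - 1)
              for i in range(n)]
    order = sorted(range(n), key=lambda i: scores[i], reverse=True)
    return order, scores

# Three construction-project variants; variant 0 weakly beats variant 1
# but loses to variant 2 (each off-diagonal pair sums to 1).
prefs = [[0.5, 0.6, 0.3],
         [0.4, 0.5, 0.2],
         [0.7, 0.8, 0.5]]
order, scores = rank_variants(prefs)
```

In practice the preference degrees would be aggregated from the NPV, difficulty, and financing criteria rather than set directly.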

  13. A hybrid learning method for constructing compact rule-based fuzzy models.

    PubMed

    Zhao, Wanqing; Niu, Qun; Li, Kang; Irwin, George W

    2013-12-01

    The Takagi–Sugeno–Kang-type rule-based fuzzy model has found many applications in different fields; a major challenge is, however, to build a compact model with optimized model parameters which leads to satisfactory model performance. To produce a compact model, most existing approaches mainly focus on selecting an appropriate number of fuzzy rules. In contrast, this paper considers not only the selection of fuzzy rules but also the structure of each rule premise and consequent, leading to the development of a novel compact rule-based fuzzy model. Here, each fuzzy rule is associated with two sets of input attributes, in which the first is used for constructing the rule premise and the other is employed in the rule consequent. A new hybrid learning method combining the modified harmony search method with a fast recursive algorithm is hereby proposed to determine the structure and the parameters for the rule premises and consequents. This is a hard mixed-integer nonlinear optimization problem, and the proposed hybrid method solves the problem by employing an embedded framework, leading to a significantly reduced number of model parameters and a small number of fuzzy rules with each being as simple as possible. Results from three examples are presented to demonstrate the compactness (in terms of the number of model parameters and the number of rules) and the performance of the fuzzy models obtained by the proposed hybrid learning method, in comparison with other techniques from the literature.

  14. Online selective kernel-based temporal difference learning.

    PubMed

    Chen, Xingguo; Gao, Yang; Wang, Ruili

    2013-12-01

    In this paper, an online selective kernel-based temporal difference (OSKTD) learning algorithm is proposed to deal with large-scale and/or continuous reinforcement learning problems. OSKTD includes two online procedures: online sparsification and parameter updating for the selective kernel-based value function. A new sparsification method (i.e., a kernel distance-based online sparsification method) is proposed based on selective ensemble learning, which is computationally less complex than other sparsification methods. With the proposed sparsification method, the sparsified dictionary of samples is constructed online by checking whether a sample needs to be added to the sparsified dictionary. In addition, based on local validity, a selective kernel-based value function is proposed to select the best samples from the sample dictionary for the selective kernel-based value function approximator. The parameters of the selective kernel-based value function are iteratively updated using the temporal difference (TD) learning algorithm combined with the gradient descent technique. The complexity of the online sparsification procedure in the OSKTD algorithm is O(n). In addition, two typical experiments (Maze and Mountain Car) are used to compare OSKTD with both traditional and up-to-date O(n) algorithms (GTD, GTD2, and TDC using the kernel-based value function), and the results demonstrate the effectiveness of the proposed algorithm. In the Maze problem, OSKTD converges to an optimal policy and converges faster than both traditional and up-to-date algorithms. In the Mountain Car problem, OSKTD converges, requires less computation time than other sparsification methods, reaches a better local optimum than the traditional algorithms, and converges much faster than the up-to-date algorithms. In addition, OSKTD reaches a final optimum competitive with the up-to-date algorithms.
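The kernel distance-based sparsification idea can be sketched as follows. This is a simplified, assumed version with a Gaussian kernel and a fixed novelty threshold; the paper's selective-ensemble details are omitted:

```python
import math

def gauss_kernel(x, y, sigma=1.0):
    return math.exp(-(x - y) ** 2 / (2 * sigma ** 2))

def sparsify(stream, mu=0.5, sigma=1.0):
    """Kernel distance-based online sparsification (simplified sketch of the
    dictionary construction): a sample joins the dictionary only if its
    squared feature-space distance to every current member exceeds mu."""
    dictionary = []
    for x in stream:
        # ||phi(x) - phi(d)||^2 = k(x,x) - 2 k(x,d) + k(d,d)
        d2 = min((gauss_kernel(x, x, sigma) - 2 * gauss_kernel(x, d, sigma)
                  + gauss_kernel(d, d, sigma) for d in dictionary),
                 default=float("inf"))
        if d2 > mu:
            dictionary.append(x)
    return dictionary
```

Each incoming sample costs one pass over the dictionary, which is what keeps the online procedure O(n).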

  15. Gene-Specific Substitution Profiles Describe the Types and Frequencies of Amino Acid Changes during Antibody Somatic Hypermutation.

    PubMed

    Sheng, Zizhang; Schramm, Chaim A; Kong, Rui; Mullikin, James C; Mascola, John R; Kwong, Peter D; Shapiro, Lawrence

    2017-01-01

    Somatic hypermutation (SHM) plays a critical role in the maturation of antibodies, optimizing recognition initiated by recombination of V(D)J genes. Previous studies have shown that the propensity to mutate is modulated by the context of surrounding nucleotides and that the SHM machinery generates biased substitutions. To investigate the intrinsic mutation frequency and substitution bias of SHMs at the amino acid level, we analyzed functional human antibody repertoires and developed mGSSP (method for gene-specific substitution profile), a method to construct amino acid substitution profiles from next-generation sequencing-determined B cell transcripts. We demonstrated that these gene-specific substitution profiles (GSSPs) are unique to each V gene and highly consistent between donors. We also showed that the GSSPs constructed from functional antibody repertoires are highly similar to those constructed from antibody sequences amplified from non-productively rearranged passenger alleles, which do not undergo functional selection. This suggests that the types and frequencies, or mutational space, of the majority of amino acid changes sampled by the SHM machinery are well captured by GSSPs. We further observed that the rates of mutational exchange between some amino acids are both asymmetric and context dependent and correlate weakly with their biochemical properties. GSSPs provide an improved, position-dependent alternative to standard substitution matrices, and can be utilized to develop software for accurately modeling the SHM process. GSSPs can also be used for predicting the amino acid mutational space available for antigen-driven selection and for understanding factors modulating the maturation pathways of antibody lineages in a gene-specific context.
The mGSSP method can be used to build, compare, and plot GSSPs; we report the GSSPs constructed for 69 common human V genes (DOI: 10.6084/m9.figshare.3511083) and provide high-resolution logo plots for each (DOI: 10.6084/m9.figshare.3511085).
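In outline, building a substitution profile from aligned sequences is straightforward; the sketch below uses toy amino acid strings and plain per-position frequencies, and is an illustration of the idea rather than the published mGSSP pipeline:

```python
from collections import Counter

def build_gssp(germline, reads):
    """Sketch of a gene-specific substitution profile (GSSP): for each
    germline position, the relative frequency of each substituting amino
    acid among reads that mutated that position. Toy data, not mGSSP."""
    profile = {}
    for pos, aa in enumerate(germline):
        subs = Counter(r[pos] for r in reads if r[pos] != aa)
        total = sum(subs.values())
        profile[pos] = ({res: n / total for res, n in subs.items()}
                        if total else {})
    return profile

# Hypothetical 4-residue germline and four aligned mutated reads
germline = "CAST"
reads = ["CAST", "CGST", "CGSA", "CASA"]
gssp = build_gssp(germline, reads)
```

Positions never mutated get an empty profile; mutated positions record which residues replaced the germline one and how often.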

  16. Boosted Regression Trees Outperforms Support Vector Machines in Predicting (Regional) Yields of Winter Wheat from Single and Cumulated Dekadal Spot-VGT Derived Normalized Difference Vegetation Indices

    NASA Astrophysics Data System (ADS)

    Stas, Michiel; Dong, Qinghan; Heremans, Stien; Zhang, Beier; Van Orshoven, Jos

    2016-08-01

    This paper compares two machine learning techniques for predicting regional winter wheat yields. The models, based on Boosted Regression Trees (BRT) and Support Vector Machines (SVM), are constructed from Normalized Difference Vegetation Indices (NDVI) derived from low-resolution SPOT VEGETATION satellite imagery. Three types of NDVI-related predictors were used: Single NDVI, Incremental NDVI and Targeted NDVI. BRT and SVM were first used to select features with high relevance for predicting yield. Although the exact selections differed between the prefectures, certain periods with high influence scores for multiple prefectures could be identified. The same period of high influence, stretching from March to June, was detected by both machine learning methods. After feature selection, BRT and SVM models were applied to the subset of selected features for actual yield forecasting. Whereas both machine learning methods returned very low prediction errors, BRT seems to slightly but consistently outperform SVM.

  17. Constructing a bivariate distribution function with given marginals and correlation: application to the galaxy luminosity function

    NASA Astrophysics Data System (ADS)

    Takeuchi, Tsutomu T.

    2010-08-01

    We provide an analytic method to construct a bivariate distribution function (DF) with given marginal distributions and correlation coefficient. We introduce a convenient mathematical tool, called a copula, to connect two DFs with any prescribed dependence structure. If the correlation of two variables is weak (Pearson's correlation coefficient |ρ| < 1/3), the Farlie-Gumbel-Morgenstern (FGM) copula provides an intuitive and natural way to construct such a bivariate DF. When the linear correlation is stronger, the FGM copula no longer works. In this case, we propose using a Gaussian copula, which connects two given marginals and is directly related to the linear correlation coefficient between the two variables. Using the copulas, we construct the bivariate luminosity function (BLF) and discuss its statistical properties. We focus especially on the far-ultraviolet-far-infrared (FUV-FIR) BLF, since these two wavelength regions are related to star-formation (SF) activity. Though both the FUV and FIR are related to SF activity, the univariate LFs have very different functional forms: the former is well described by the Schechter function whilst the latter has a much more extended power-law-like luminous end. We construct the FUV-FIR BLFs using the FGM and Gaussian copulas with different strengths of correlation, and examine their statistical properties. We then discuss some further possible applications of the BLF: the problem of a multiband flux-limited sample selection, the construction of the star-formation rate (SFR) function, and the construction of the stellar mass of galaxies (M*)-specific SFR (SFR/M*) relation. The copulas turn out to be a very useful tool for investigating all these issues, especially for including complicated selection effects.
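The FGM construction is simple enough to sketch directly. The copula is C(u, v) = uv[1 + θ(1 − u)(1 − v)] with |θ| ≤ 1; for uniform marginals Pearson's ρ = θ/3, which is exactly the |ρ| < 1/3 limit quoted above. Sampling by inverting the conditional CDF reduces to a quadratic:

```python
import random, math

def fgm_sample(theta, n, seed=0):
    """Sample (u, v) from the FGM copula C(u,v) = uv(1 + theta(1-u)(1-v)),
    |theta| <= 1, by inverting the conditional CDF
    C(v|u) = v + theta*v*(1-v)*(1-2u).  Setting it equal to a uniform w
    gives the quadratic a*v^2 - (1+a)*v + w = 0 with a = theta*(1-2u)."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        u, w = rng.random(), rng.random()
        a = theta * (1 - 2 * u)
        if abs(a) < 1e-12:
            v = w  # conditional is uniform when the coupling vanishes
        else:
            v = ((1 + a) - math.sqrt((1 + a) ** 2 - 4 * a * w)) / (2 * a)
        out.append((u, v))
    return out
```

With theta = 1 the sample correlation of (u, v) approaches the theoretical maximum 1/3.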

  18. Incipient Fault Detection for Rolling Element Bearings under Varying Speed Conditions.

    PubMed

    Xue, Lang; Li, Naipeng; Lei, Yaguo; Li, Ningbo

    2017-06-20

    Varying speed conditions bring a huge challenge to incipient fault detection of rolling element bearings because both the change of speed and faults could lead to the amplitude fluctuation of vibration signals. Effective detection methods need to be developed to eliminate the influence of speed variation. This paper proposes an incipient fault detection method for bearings under varying speed conditions. Firstly, relative residual (RR) features are extracted, which are insensitive to the varying speed conditions and are able to reflect the degradation trend of bearings. Then, a health indicator named selected negative log-likelihood probability (SNLLP) is constructed to fuse a feature set including RR features and non-dimensional features. Finally, based on the constructed SNLLP health indicator, a novel alarm trigger mechanism is designed to detect the incipient fault. The proposed method is demonstrated using vibration signals from bearing tests and industrial wind turbines. The results verify the effectiveness of the proposed method for incipient fault detection of rolling element bearings under varying speed conditions.

  19. Incipient Fault Detection for Rolling Element Bearings under Varying Speed Conditions

    PubMed Central

    Xue, Lang; Li, Naipeng; Lei, Yaguo; Li, Ningbo

    2017-01-01

    Varying speed conditions bring a huge challenge to incipient fault detection of rolling element bearings because both the change of speed and faults could lead to the amplitude fluctuation of vibration signals. Effective detection methods need to be developed to eliminate the influence of speed variation. This paper proposes an incipient fault detection method for bearings under varying speed conditions. Firstly, relative residual (RR) features are extracted, which are insensitive to the varying speed conditions and are able to reflect the degradation trend of bearings. Then, a health indicator named selected negative log-likelihood probability (SNLLP) is constructed to fuse a feature set including RR features and non-dimensional features. Finally, based on the constructed SNLLP health indicator, a novel alarm trigger mechanism is designed to detect the incipient fault. The proposed method is demonstrated using vibration signals from bearing tests and industrial wind turbines. The results verify the effectiveness of the proposed method for incipient fault detection of rolling element bearings under varying speed conditions. PMID:28773035
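The fuse-then-threshold structure of the method above can be illustrated with a deliberately simplified stand-in for the SNLLP indicator: Gaussian per-feature likelihoods fitted on a healthy baseline, fused as a summed negative log-likelihood, plus a consecutive-exceedance alarm. The paper's feature selection step and probability model are not reproduced here:

```python
import math, statistics

def nll_indicator(baseline, sample):
    """Simplified likelihood-based health indicator (inspired by, not
    identical to, SNLLP): fit a Gaussian per feature on healthy baseline
    data, then fuse features as the summed negative log-likelihood."""
    nll = 0.0
    for key, history in baseline.items():
        mu = statistics.fmean(history)
        sd = statistics.pstdev(history) or 1e-9  # guard zero variance
        x = sample[key]
        nll += 0.5 * math.log(2 * math.pi * sd * sd) + (x - mu) ** 2 / (2 * sd * sd)
    return nll

def alarm(indicators, threshold, consecutive=3):
    """Trigger only after `consecutive` points exceed the threshold, which
    suppresses spurious spikes such as those caused by speed fluctuations."""
    run = 0
    for i, h in enumerate(indicators):
        run = run + 1 if h > threshold else 0
        if run >= consecutive:
            return i  # index at which the alarm fires
    return None
```

A sample far from the healthy baseline yields a larger indicator value, and isolated spikes do not trip the alarm.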

  20. Roads and Airfields I (Programed Instruction). Engineer Subcourse 64-9.

    ERIC Educational Resources Information Center

    Army Engineer School, Fort Belvoir, VA.

    The document is a programed text for a correspondence course in the planning, construction, and maintenance of military roads and airfields. There are seven lessons: construction requirements and design criteria; road reconnaissance and site selection; airfield reconnaissance and site selection; layout procedures, construction staking, and field…

  1. 41 CFR 102-74.135 - Who selects construction and alteration projects that are to be performed?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... and alteration projects that are to be performed? 102-74.135 Section 102-74.135 Public Contracts and... construction and alteration projects that are to be performed? The Administrator of General Services selects construction and alteration projects to be performed. ...

  2. Career Construction with a Gay Client: A Case Study

    ERIC Educational Resources Information Center

    Maree, Jacobus Gideon

    2014-01-01

    This article reports on the value of career construction counselling (CCC) with a gay person. The participant was selected purposively, with the selection criteria calling for a mid-career woman who had sought career counselling. The intervention involved administration of the "Career Construction Interview" (CCI) and the creation of a…

  3. Improving the spectral measurement accuracy based on temperature distribution and spectra-temperature relationship

    NASA Astrophysics Data System (ADS)

    Li, Zhe; Feng, Jinchao; Liu, Pengyu; Sun, Zhonghua; Li, Gang; Jia, Kebin

    2018-05-01

    Temperature is usually treated as a source of fluctuation in near-infrared spectral measurement, and chemometric methods have been extensively studied to correct for the effect of temperature variations. However, temperature can also be treated as a constructive parameter that provides detailed chemical information when systematically varied during the measurement. Our group has investigated the relationship between temperature-induced spectral variation (TSVC) and normalized squared temperature. In this study, we focused on the influence of the temperature distribution in the calibration set. A multi-temperature calibration set selection (MTCS) method is proposed to improve prediction accuracy by considering the temperature distribution of the calibration samples. Furthermore, a double-temperature calibration set selection (DTCS) method is proposed based on MTCS and the relationship between TSVC and normalized squared temperature. We compared the prediction performance of PLS models based on random sampling and on the proposed methods. Results from experimental studies showed that the proposed methods improve prediction performance; MTCS and DTCS are therefore alternative methods for improving prediction accuracy in near-infrared spectral measurement.
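The abstract does not give the MTCS algorithm itself; one plausible reading, assumed here purely for illustration, is a stratified pick over temperature bins so that the calibration set covers the temperature range instead of clustering where sampling happened to be dense:

```python
def select_calibration(temps, n_select, n_bins=4):
    """Hypothetical stratified calibration-set selection: bin candidate
    samples by temperature and draw from the bins round-robin."""
    lo, hi = min(temps), max(temps)
    width = (hi - lo) / n_bins or 1.0  # guard the all-equal case
    bins = [[] for _ in range(n_bins)]
    for i, t in enumerate(temps):
        b = min(int((t - lo) / width), n_bins - 1)
        bins[b].append(i)
    chosen, k = [], 0
    while len(chosen) < n_select:
        b = bins[k % n_bins]
        if b:
            chosen.append(b.pop(0))
        k += 1
        if all(not b for b in bins):
            break  # fewer candidates than requested
    return chosen
```

Selecting four samples from temperatures clustered at 20, 30, 40 and 50 degrees returns one index from each cluster.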

  4. Verification Techniques for Parameter Selection and Bayesian Model Calibration Presented for an HIV Model

    NASA Astrophysics Data System (ADS)

    Wentworth, Mami Tonoe

    Uncertainty quantification plays an important role when making predictive estimates of model responses. In this context, uncertainty quantification is defined as quantifying and reducing uncertainties, and the objective is to quantify uncertainties in parameters, models and measurements, and to propagate the uncertainties through the model, so that one can make a predictive estimate with quantified uncertainties. Two of the aspects of uncertainty quantification that must be performed prior to propagating uncertainties are model calibration and parameter selection. There are several efficient techniques for these processes; however, the accuracy of these methods is often not verified. This is the motivation for our work, and in this dissertation, we present and illustrate verification frameworks for model calibration and parameter selection in the context of biological and physical models. First, HIV models, developed and improved by [2, 3, 8], describe the viral infection dynamics of HIV disease. These are also used to make predictive estimates of viral loads and T-cell counts and to construct an optimal control for drug therapy. Estimating input parameters is an essential step prior to uncertainty quantification. However, not all the parameters are identifiable, implying that they cannot be uniquely determined by the observations. These unidentifiable parameters can be partially removed by performing parameter selection, a process in which parameters that have minimal impact on the model response are determined. We provide verification techniques for Bayesian model calibration and parameter selection for an HIV model. As an example of a physical model, we employ a heat model with experimental measurements presented in [10]. A steady-state heat model represents prototypical behavior for the heat conduction and diffusion processes involved in a thermal-hydraulic model, which is part of a nuclear reactor model.
We employ this simple heat model to illustrate verification techniques for model calibration. For Bayesian model calibration, we employ adaptive Metropolis algorithms to construct densities for input parameters in the heat model and the HIV model. To quantify the uncertainty in the parameters, we employ two MCMC algorithms: Delayed Rejection Adaptive Metropolis (DRAM) [33] and Differential Evolution Adaptive Metropolis (DREAM) [66, 68]. The densities obtained using these methods are compared to those obtained through the direct numerical evaluation of the Bayes' formula. We also combine uncertainties in input parameters and measurement errors to construct predictive estimates for a model response. A significant emphasis is on the development and illustration of techniques to verify the accuracy of sampling-based Metropolis algorithms. We verify the accuracy of DRAM and DREAM by comparing chains, densities and correlations obtained using DRAM, DREAM and the direct evaluation of Bayes formula. We also perform similar analysis for credible and prediction intervals for responses. Once the parameters are estimated, we employ energy statistics test [63, 64] to compare the densities obtained by different methods for the HIV model. The energy statistics are used to test the equality of distributions. We also consider parameter selection and verification techniques for models having one or more parameters that are noninfluential in the sense that they minimally impact model outputs. We illustrate these techniques for a dynamic HIV model but note that the parameter selection and verification framework is applicable to a wide range of biological and physical models. 
To accommodate the nonlinear input to output relations, which are typical for such models, we focus on global sensitivity analysis techniques, including those based on partial correlations, Sobol indices based on second-order model representations, and Morris indices, as well as a parameter selection technique based on standard errors. A significant objective is to provide verification strategies to assess the accuracy of those techniques, which we illustrate in the context of the HIV model. Finally, we examine active subspace methods as an alternative to parameter subset selection techniques. The objective of active subspace methods is to determine the subspace of inputs that most strongly affect the model response, and to reduce the dimension of the input space. The major difference between active subspace methods and parameter selection techniques is that parameter selection identifies influential parameters whereas subspace selection identifies a linear combination of parameters that impacts the model responses significantly. We employ active subspace methods discussed in [22] for the HIV model and present a verification that the active subspace successfully reduces the input dimensions.
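The Metropolis machinery underlying DRAM and DREAM can be shown in its plainest form. This is a generic random-walk sampler for a one-dimensional log-posterior, not the adaptive or delayed-rejection variants verified in the dissertation:

```python
import random, math

def metropolis(log_post, x0, n, step=0.5, seed=1):
    """Plain random-walk Metropolis: draw from a density known only up to a
    constant via accept/reject on the log-posterior."""
    rng = random.Random(seed)
    x, chain = x0, []
    lp = log_post(x)
    for _ in range(n):
        cand = x + rng.gauss(0, step)          # symmetric proposal
        lpc = log_post(cand)
        if math.log(rng.random()) < lpc - lp:  # accept with prob min(1, ratio)
            x, lp = cand, lpc
        chain.append(x)
    return chain

# Target: standard normal, so the chain mean and variance approach 0 and 1
chain = metropolis(lambda x: -0.5 * x * x, 0.0, 20000)
```

Verification in the spirit of the dissertation would compare such chains against the direct numerical evaluation of Bayes' formula; here the known analytic moments of the target play that role.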

  5. Mining nutrigenetics patterns related to obesity: use of parallel multifactor dimensionality reduction.

    PubMed

    Karayianni, Katerina N; Grimaldi, Keith A; Nikita, Konstantina S; Valavanis, Ioannis K

    2015-01-01

    This paper aims to elucidate the complex etiology underlying obesity by analysing data from a large nutrigenetics study, in which nutritional and genetic factors associated with obesity were recorded for around two thousand individuals. In our previous work, these data were analysed using artificial neural network methods, which identified optimised subsets of factors for predicting one's obesity status. These methods did not, however, reveal how the selected factors interact with each other in the resulting predictive models. For that reason, parallel Multifactor Dimensionality Reduction (pMDR) was used here to further analyse the pre-selected subsets of nutrigenetic factors. Within pMDR, predictive models using up to eight factors were constructed, further reducing the input dimensionality, while rules describing the interactive effects of the selected factors were derived. In this way, it was possible to identify specific genetic variations and their interactive effects with particular nutritional factors, which are now under further study.
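The core MDR step, pooling multilocus genotype cells by their case/control balance and scoring each factor pair, can be sketched as below. This is a toy, non-parallel version without cross-validation, not the pMDR implementation used in the study:

```python
from collections import defaultdict
from itertools import combinations

def mdr_best_pair(rows, labels):
    """Toy MDR: for every pair of factors, assign each multilocus genotype
    cell the majority class observed in that cell, then keep the pair whose
    pooled rule classifies the training data best."""
    n_factors = len(rows[0])
    best = (None, -1.0)
    for pair in combinations(range(n_factors), 2):
        cells = defaultdict(lambda: [0, 0])  # cell -> [controls, cases]
        for row, y in zip(rows, labels):     # y must be 0 (control) or 1 (case)
            cells[(row[pair[0]], row[pair[1]])][y] += 1
        correct = sum(max(c) for c in cells.values())
        acc = correct / len(rows)
        if acc > best[1]:
            best = (pair, acc)
    return best
```

On a toy dataset where the label is the XOR of factors 0 and 1 (a pure interaction with no marginal effect) and factor 2 is noise, the method recovers the interacting pair.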

  6. Evaluation of accuracy of shade selection using two spectrophotometer systems: Vita Easyshade and Degudent Shadepilot.

    PubMed

    Kalantari, Mohammad Hassan; Ghoraishian, Seyed Ahmad; Mohaghegh, Mina

    2017-01-01

    The aim of this in vitro study was to evaluate the accuracy of shade matching using two spectrophotometric devices. Thirteen patients who required a full-coverage restoration for one of their maxillary central incisors, with the adjacent central incisor intact, were selected. Three identical frameworks were constructed for each tooth using computer-aided design and computer-aided manufacturing technology. Shade matching was performed using the Vita Easyshade spectrophotometer, the Shadepilot spectrophotometer, and the Vitapan classical shade guide for the first, second, and third crown, respectively. After application, firing, and glazing of the porcelain, the color was evaluated and scored by five inspectors. Both spectrophotometric systems showed significantly better results than the visual method (P < 0.05), while there was no significant difference between the Vita Easyshade and Shadepilot spectrophotometers (P > 0.05). Spectrophotometers are a good substitute for visual color selection methods.

  7. Study on energy consumption evaluation of mountainous highway based on LCA

    NASA Astrophysics Data System (ADS)

    Fei, Lunlin; Zhang, Qi; Xie, Yongqing

    2017-06-01

    To understand the energy consumption of road construction systematically, this paper selects a typical mountainous highway in the south and uses the theory and methods of Life Cycle Assessment (LCA) to quantitatively study the energy consumption of the whole process of highway raw material production, construction and operation. The results show that energy consumption is highest in the raw material production stage, followed by the operation and construction stages. The energy consumption per unit of tunnel engineering, bridge engineering, roadbed engineering and pavement engineering in the construction phase is 2279.00 tce, 1718.07 tce, 542.19 tce and 34.02 tce respectively, and in the operational phase 85.44% of electricity consumption comes from tunnel ventilation and lighting. Therefore, in bridge and tunnel construction, energy-saving innovation in construction technology and mechanical equipment should be promoted, and the research and development of energy-saving tunnel ventilation and lighting equipment and intelligent control technology should be further strengthened, which will help significantly reduce the life-cycle energy consumption and greenhouse gas emissions of highways.

  8. Automatic migraine classification via feature selection committee and machine learning techniques over imaging and questionnaire data.

    PubMed

    Garcia-Chimeno, Yolanda; Garcia-Zapirain, Begonya; Gomez-Beldarrain, Marian; Fernandez-Ruanova, Begonya; Garcia-Monco, Juan Carlos

    2017-04-13

    Feature selection methods are commonly used to identify subsets of relevant features to facilitate the construction of models for classification, yet little is known about how feature selection methods perform on diffusion tensor images (DTIs). In this study, feature selection and machine learning classification methods were tested for the purpose of automating the diagnosis of migraines using both DTIs and questionnaire answers related to emotion and cognition - factors that influence pain perception. We selected 52 adult subjects for the study, divided into three groups: a control group (15), subjects with sporadic migraine (19) and subjects with chronic migraine and medication overuse (18). These subjects underwent magnetic resonance imaging with diffusion tensor sequences to assess the white matter pathway integrity of the regions of interest involved in pain and emotion. The tests also gathered data about pathology. The DTI images and test results were then introduced into feature selection algorithms (Gradient Tree Boosting, L1-based, Random Forest and Univariate) to reduce the features of the first dataset, and into classification algorithms (SVM (Support Vector Machine), Boosting (AdaBoost) and Naive Bayes) to classify the migraine group. Moreover, we implemented a committee method to improve classification accuracy based on the feature selection algorithms. When classifying the migraine group, the greatest improvements in accuracy were made using the proposed committee-based feature selection method. Using this approach, the accuracy of classification into three types improved from 67 to 93% with the Naive Bayes classifier, from 90 to 95% with the support vector machine classifier, and from 93 to 94% with boosting. The features determined to be most useful for classification were related to pain, analgesics and the left uncinate bundle (connected with pain and emotions).
The proposed feature selection committee method improved the performance of migraine diagnosis classifiers compared to individual feature selection methods, producing a robust system that achieved over 90% accuracy in all classifiers. The results suggest that the proposed methods can be used to support specialists in the classification of migraines in patients undergoing magnetic resonance imaging.
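The committee idea reduces to voting across selectors. A minimal sketch follows; the majority quorum and top-k lists are assumptions, since the abstract does not specify the study's exact fusion rule:

```python
from collections import Counter

def committee_select(rankings, k):
    """Committee feature selection sketch: each selector contributes its
    top-k ranked features; keep features chosen by a strict majority."""
    votes = Counter(f for ranking in rankings for f in ranking[:k])
    quorum = len(rankings) / 2
    return sorted(f for f, v in votes.items() if v > quorum)
```

With three selectors and k = 2, only the features named by at least two selectors survive.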

  9. Mining a clinical data warehouse to discover disease-finding associations using co-occurrence statistics

    PubMed Central

    Cao, Hui; Markatou, Marianthi; Melton, Genevieve B.; Chiang, Michael F.; Hripcsak, George

    2005-01-01

    This paper applies co-occurrence statistics to discover disease-finding associations in a clinical data warehouse. We used two methods, χ2 statistics and the proportion confidence interval (PCI) method, to measure the dependence of pairs of diseases and findings, and then used heuristic cutoff values for association selection. An intrinsic evaluation showed that 94 percent of disease-finding associations obtained by χ2 statistics and 76.8 percent obtained by the PCI method were true associations. The selected associations were used to construct knowledge bases of disease-finding relations (KB-χ2, KB-PCI). An extrinsic evaluation showed that both KB-χ2 and KB-PCI could assist in eliminating clinically non-informative and redundant findings from problem lists generated by our automated problem list summarization system. PMID:16779011
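For a single disease-finding pair, the χ² screening statistic comes from the 2×2 co-occurrence table. A minimal implementation, without the paper's heuristic cutoffs or the PCI method:

```python
def chi2_2x2(n11, n10, n01, n00):
    """2x2 chi-square statistic for a disease-finding pair:
    n11 = records mentioning both, n10 = disease only,
    n01 = finding only, n00 = neither."""
    n = n11 + n10 + n01 + n00
    num = n * (n11 * n00 - n10 * n01) ** 2
    den = (n11 + n10) * (n01 + n00) * (n11 + n01) * (n10 + n00)
    return num / den if den else 0.0
```

A perfectly balanced table scores 0 (independence), while strong co-occurrence yields a large statistic that would pass an association cutoff.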

  10. Construct-level predictive validity of educational attainment and intellectual aptitude tests in medical student selection: meta-regression of six UK longitudinal studies.

    PubMed

    McManus, I C; Dewberry, Chris; Nicholson, Sandra; Dowell, Jonathan S; Woolf, Katherine; Potts, Henry W W

    2013-11-14

    Measures used for medical student selection should predict future performance during training. A problem for any selection study is that predictor-outcome correlations are known only in those who have been selected, whereas selectors need to know how measures would predict in the entire pool of applicants. That problem of interpretation can be solved by calculating construct-level predictive validity, an estimate of the true predictor-outcome correlation across the range of applicant abilities. Construct-level predictive validities were calculated in six cohort studies of medical student selection and training (student entry, 1972 to 2009) for a range of predictors, including A-levels, General Certificates of Secondary Education (GCSEs)/O-levels, and aptitude tests (AH5 and UK Clinical Aptitude Test (UKCAT)). Outcomes included undergraduate basic medical science and finals assessments, as well as postgraduate measures of Membership of the Royal Colleges of Physicians of the United Kingdom (MRCP(UK)) performance and entry in the Specialist Register. Construct-level predictive validity was calculated with the method of Hunter, Schmidt and Le (2006), adapted to correct for right-censorship of examination results due to grade inflation. Meta-regression analyzed 57 separate predictor-outcome correlations (POCs) and construct-level predictive validities (CLPVs). Mean CLPVs are substantially higher (.450) than mean POCs (.171). Mean CLPVs for first-year examinations were high for A-levels (.809; CI: .501 to .935), and lower for GCSEs/O-levels (.332; CI: .024 to .583) and UKCAT (.245; CI: .207 to .276). A-levels had higher CLPVs for all undergraduate and postgraduate assessments than did GCSEs/O-levels and intellectual aptitude tests. CLPVs of educational attainment measures decline somewhat during training, but continue to predict postgraduate performance. Intellectual aptitude tests have lower CLPVs than A-levels or GCSEs/O-levels.
Educational attainment has strong CLPVs for undergraduate and postgraduate performance, accounting for perhaps 65% of true variance in first-year performance. Such CLPVs justify the use of educational attainment measures in selection, but also raise a key theoretical question concerning the remaining 35% of variance (after measurement error, range restriction and right-censorship have been taken into account). Just as in astrophysics 'dark matter' and 'dark energy' are posited to balance various theoretical equations, so medical student selection must also have its 'dark variance', whose nature is not yet properly characterized but which explains a third of the variation in performance during training. Some of this variance probably relates to factors that are unpredictable at selection, such as illness or other life events, but some is probably also associated with factors such as personality, motivation or study skills.
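One ingredient of a construct-level estimate, correcting a selected-group correlation for range restriction (Thorndike Case II), is compact enough to show. This is the standard textbook formula, not the full Hunter-Schmidt-Le procedure with the censorship correction used in the study:

```python
import math

def correct_range_restriction(r, u):
    """Thorndike Case II correction: r is the predictor-outcome correlation
    observed in the selected group; u is the ratio of the predictor's SD in
    the selected group to its SD in the full applicant pool (u < 1 under
    selection). Returns the estimated correlation in the applicant pool."""
    return r / math.sqrt(r * r + u * u * (1 - r * r))
```

For example, an observed r of .171 (the mean POC above) in a group whose predictor SD was halved by selection corrects to roughly .33, illustrating why CLPVs run well above raw POCs.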

  11. Methodical bases of selection and evaluation of the effectiveness of the projects of the urban territory renovation

    NASA Astrophysics Data System (ADS)

    Sizova, Evgeniya; Zhutaeva, Evgeniya; Chugunov, Andrei

    2018-03-01

    The article highlights features of urban territory renovation processes from the perspective of a commercial entity participating in the implementation of a project. The requirements that high-rise construction projects impose on the entities carrying them out are considered. The advantages of large enterprises as participants in renovation projects, which contribute to the most efficient implementation, are systematized. The factors that influence the success of renovation projects are presented. A method is suggested for selecting projects for implementation based on criteria grouped by qualitative characteristics and contributing to the most complete and comprehensive evaluation of a project. Patterns for prioritizing and harmonizing renovation projects within the multi-project activity of an enterprise are considered.

  12. A Study on the Rural Residence in the Northern Area of Zhejiang Province from the Perspective of Green Living Environment

    NASA Astrophysics Data System (ADS)

    Wang, J.; Gao, W. J.; Wang, C.

    2018-05-01

    At present, owing to its rapid development, rural construction lacks corresponding theory and practice and damages the character of rural areas, ignoring geography, suitability and green living environment factors. The research selects the rural residence as its object, defining the “courtyard” as the basic unit of the rural residence. It uses the principle of topology as the expanding medium, together with the principle of cellular structure and a green living environment design strategy. The essay establishes a design and construction system of the “rural basic unit”, combining functions and structures, a prototype menu, chamber space and compound interfaces, from the perspective of the green living environment. It aims to guide rural construction and protect the rural living environment.

  13. Progress in gene targeting and gene therapy for retinitis pigmentosa

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farrar, G.J.; Humphries, M.M.; Erven, A.

    1994-09-01

    Previously, we localized disease genes involved in retinitis pigmentosa (RP), an inherited retinal degeneration, close to the rhodopsin and peripherin genes on 3q and 6p. Subsequently, we and others identified mutations in these genes in RP patients. Currently, animal models for human retinopathies are being generated using gene targeting by homologous recombination in embryonic stem (ES) cells. Genomic clones for retinal genes including rhodopsin and peripherin have been obtained from a phage library carrying mouse DNA isogenic with the ES cell line (CC1.2). The peripherin clone has been sequenced to establish the genomic structure of the mouse gene. Targeting vectors for rhodopsin and peripherin, including a neomycin cassette for positive selection and thymidine kinase genes enabling selection against random integrants, are under construction. Progress in vector construction will be presented. Simultaneously, we are developing systems for delivery of gene therapies to retinal tissues utilizing replication-deficient adenovirus (Ad5). Efficacy of infection subsequent to various methods of intraocular injection and with varying viral titers is being assayed using an adenovirus construct containing a CMV promoter-LacZ fusion as reporter, and the range of tissues infected and the level and duration of LacZ expression are monitored. Viral constructs with the LacZ reporter gene under the control of retina-specific promoters such as rhodopsin and IRBP cloned into pXCJL.1 are under construction. An update on developments in photoreceptor cell-directed expression of virally delivered genes will be presented.

  14. Evaluation of food storage racks available on the Polish market in the hygienic context

    PubMed

    Grzesińska, Wiesława; Tomaszewska, Marzena; Bilska, Beata; Trafiałek, Joanna; Dziadek, Michał

    Providing safe food products to the consumer depends on the materials and technology used and on adherence to hygienic practices throughout the production process. The degree of microbial contamination of a surface is an important indicator of equipment cleanliness and of the effectiveness of cleaning and disinfection. The material used, construction solutions and quality of the applied devices also affect hygienic status. The objective of the present study was to evaluate the influence of the design and construction material of selected food storage racks, available on the Polish market, on their hygienic status. The study was based on determining the capability of microbial growth on the surface of the racks and the effectiveness of their cleaning. Microbiological cleanliness of the rack surfaces was monitored with contact plates, which estimate the total number of microorganisms. Cleaning effectiveness was examined using the ATP bioluminescence method. The experiment demonstrated a significant influence of the adopted construction solutions on the hygienic status of the examined racks. The presence of an antibacterial layer and the choice of an appropriate construction material with low surface roughness impede microbial growth and increase the effectiveness of cleaning. Design solutions have a significant impact on the hygienic status of shelves. Selection of a suitable material for the construction of racks can greatly reduce the possibility of microbial development, even when cleaning efficiency is low. The application of antimicrobial coatings inhibits microbial growth.

  15. A scalable, fully automated process for construction of sequence-ready human exome targeted capture libraries

    PubMed Central

    2011-01-01

    Genome targeting methods enable cost-effective capture of specific subsets of the genome for sequencing. We present here an automated, highly scalable method for carrying out the Solution Hybrid Selection capture approach that provides a dramatic increase in scale and throughput of sequence-ready libraries produced. Significant process improvements and a series of in-process quality control checkpoints are also added. These process improvements can also be used in a manual version of the protocol. PMID:21205303

  16. Linkage of Recognition and Replication Functions by Assembling Combinatorial Antibody Fab Libraries Along Phage Surfaces

    NASA Astrophysics Data System (ADS)

    Kang, Angray S.; Barbas, Carlos F.; Janda, Kim D.; Benkovic, Stephen J.; Lerner, Richard A.

    1991-05-01

    We describe a method based on a phagemid vector with helper phage rescue for the construction and rapid analysis of combinatorial antibody Fab libraries. This approach should allow the generation and selection of many monoclonal antibodies. Antibody genes are expressed in concert with phage morphogenesis, thereby allowing incorporation of functional Fab molecules along the surface of filamentous phage. The power of the method depends upon the linkage of recognition and replication functions and is not limited to antibody molecules.

  17. Consecutive Plate Acoustic Suppressor Apparatus and Methods

    NASA Technical Reports Server (NTRS)

    Doychak, Joseph (Inventor); Parrott, Tony L. (Inventor)

    1993-01-01

    An apparatus and method for suppressing acoustic noise utilizes consecutive plates, closely spaced to each other so as to exploit dissipation associated with sound propagation in narrow channels to optimize the acoustic resistance at a liner surface. The closely spaced plates can be utilized as high temperature structural materials for jet engines by constructing the plates from composite materials. Geometries of the plates, such as plate depth, shape, thickness, inter-plate spacing, arrangement, etc., can be selected to achieve bulk material-like behavior.

  18. Fundamental Movement Skills Are More than Run, Throw and Catch: The Role of Stability Skills

    PubMed Central

    Rudd, James R.; Barnett, Lisa M.; Butson, Michael L.; Farrow, Damian; Berry, Jason; Polman, Remco C. J.

    2015-01-01

    Introduction In the motor development literature, fundamental movement skills are divided into three constructs: locomotor, object control and stability skills. Most fundamental movement skills research has focused on children’s competency in locomotor and object control skills. The first aim of this study was to validate a test battery to assess the construct of stability skills in children aged 6 to 10 (M age = 8.2, SD = 1.2). The second aim was to assess how the stability skills construct fitted into a model of fundamental movement skill. Method The Delphi method was used to select the stability skill battery. Confirmatory factor analysis (CFA) was used to assess whether the skills loaded onto the same construct, and a new model of FMS was developed using structural equation modelling. Results Three postural control tasks were selected (the log roll, rock and back support) because they had good face and content validity. These skills also demonstrated good predictive validity, with gymnasts scoring significantly better than children without gymnastic training, and children from a high-SES school performing better than those from mid- and low-SES schools, with the mid-SES children in turn scoring better than the low-SES children (all p < .05). Inter-rater reliability tests were excellent for all three skills (ICC = 0.81, 0.87, 0.87), as was test-retest reliability (ICC 0.87–0.95). CFA provided good construct validity, and structural equation modelling revealed stability skills to be an independent factor in an overall FMS model which included locomotor (r = .88), object control (r = .76) and stability skills (r = .81). Discussion This study provides a rationale for the inclusion of stability skills in FMS assessment. The stability skills could be used alongside other FMS assessment tools to provide a holistic assessment of children’s fundamental movement skills. PMID:26468644

  19. Imaging Strategies for Tissue Engineering Applications

    PubMed Central

    Nam, Seung Yun; Ricles, Laura M.; Suggs, Laura J.

    2015-01-01

    Tissue engineering has evolved with multifaceted research being conducted using advanced technologies, and it is progressing toward clinical applications. As tissue engineering technology significantly advances, it proceeds toward increasing sophistication, including nanoscale strategies for material construction and synergetic methods for combining with cells, growth factors, or other macromolecules. Therefore, to assess advanced tissue-engineered constructs, tissue engineers need versatile imaging methods capable of monitoring not only morphological but also functional and molecular information. However, there is no single imaging modality that is suitable for all tissue-engineered constructs. Each imaging method has its own range of applications and provides information based on the specific properties of the imaging technique. Therefore, according to the requirements of the tissue engineering studies, the most appropriate tool should be selected among a variety of imaging modalities. The goal of this review article is to describe available biomedical imaging methods to assess tissue engineering applications and to provide tissue engineers with criteria and insights for determining the best imaging strategies. Commonly used biomedical imaging modalities, including X-ray and computed tomography, positron emission tomography and single photon emission computed tomography, magnetic resonance imaging, ultrasound imaging, optical imaging, and emerging techniques and multimodal imaging, will be discussed, focusing on the latest trends of their applications in recent tissue engineering studies. PMID:25012069

  20. Validity and Reliability of Psychosocial Factors Related to Breast Cancer Screening.

    ERIC Educational Resources Information Center

    Zapka, Jane G.; And Others

    1991-01-01

    The construct validity of hypothesized survey items and data reduction procedures for selected psychosocial constructs frequently used in breast cancer screening research were investigated in telephone interviews with randomly selected samples of 1,184 and 903 women and a sample of 169 Hispanic clinic clients. Validity of the constructs is…

  1. Assessment of statistical methods used in library-based approaches to microbial source tracking.

    PubMed

    Ritter, Kerry J; Carruthers, Ethan; Carson, C Andrew; Ellender, R D; Harwood, Valerie J; Kingsley, Kyle; Nakatsu, Cindy; Sadowsky, Michael; Shear, Brian; West, Brian; Whitlock, John E; Wiggins, Bruce A; Wilbur, Jayson D

    2003-12-01

    Several commonly used statistical methods for fingerprint identification in microbial source tracking (MST) were examined to assess the effectiveness of pattern-matching algorithms to correctly identify sources. Although numerous statistical methods have been employed for source identification, no widespread consensus exists as to which is most appropriate. A large-scale comparison of several MST methods, using identical fecal sources, presented a unique opportunity to assess the utility of several popular statistical methods. These included discriminant analysis, nearest neighbour analysis, maximum similarity and average similarity, along with several measures of distance or similarity. Threshold criteria for excluding uncertain or poorly matched isolates from final analysis were also examined for their ability to reduce false positives and increase prediction success. Six independent libraries used in the study were constructed from indicator bacteria isolated from fecal materials of humans, seagulls, cows and dogs. Three of these libraries were constructed using the rep-PCR technique and three relied on antibiotic resistance analysis (ARA). Five of the libraries were constructed using Escherichia coli and one using Enterococcus spp. (ARA). Overall, the outcome of this study suggests a high degree of variability across statistical methods. Despite large differences in correct classification rates among the statistical methods, no single statistical approach emerged as superior. Thresholds failed to consistently increase rates of correct classification and improvement was often associated with substantial effective sample size reduction. Recommendations are provided to aid in selecting appropriate analyses for these types of data.
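The nearest-neighbour matching with a rejection threshold described in this record can be sketched generically. This is an illustration only, not the study's actual algorithm: the fingerprint vectors, the cosine-similarity measure and the 0.8 threshold are all hypothetical choices standing in for whatever distance measure and threshold criterion a given MST library uses.

```python
import numpy as np

def classify_isolates(library, labels, isolates, min_similarity=0.8):
    """Assign each isolate to the source of its nearest library fingerprint,
    rejecting poor matches (below the similarity threshold) as None."""
    assignments = []
    for iso in isolates:
        # cosine similarity against every library fingerprint
        sims = library @ iso / (np.linalg.norm(library, axis=1) * np.linalg.norm(iso))
        best = int(np.argmax(sims))
        assignments.append(labels[best] if sims[best] >= min_similarity else None)
    return assignments

# toy fingerprints: rows are library isolates from known fecal sources
lib = np.array([[1.0, 0.0, 0.2], [0.9, 0.1, 0.1], [0.1, 1.0, 0.8]])
src = ["human", "human", "gull"]
unknowns = np.array([[0.95, 0.05, 0.15], [1.0, 1.0, 0.0]])
print(classify_isolates(lib, src, unknowns))  # second isolate falls below the threshold
```

Excluding sub-threshold isolates trades prediction coverage for fewer false positives, which is exactly the trade-off the study reports: thresholds often shrank the effective sample size without consistently improving correct-classification rates.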

  2. Construct Validity and Reliability of the Tolerance Scale among Iranian College Students

    ERIC Educational Resources Information Center

    Ersanli, Ercümend; Mameghani, Shiva Saeighi

    2016-01-01

    In the present study, the Tolerance Scale developed by Ersanli (2014) was adapted to the Iranian culture, and its validity and reliability were investigated in the case of Iranian college students. The participants consisted of 552 Iranian college students (62% male, M = 20.84, S.D.: 1.53) selected using the convenience sampling method. The sample…

  3. Horizontal directional drilling: a green and sustainable technology for site remediation.

    PubMed

    Lubrecht, Michael D

    2012-03-06

    Sustainability has become an important factor in the selection of remedies to clean up contaminated sites. Horizontal directional drilling (HDD) is a relatively new drilling technology that has been successfully adapted to site remediation. In addition to the benefits that HDD provides for the logistics of site cleanup, it also delivers sustainability advantages, compared to alternative construction methods.

  4. An infrared spectral database for detection of gases emitted by biomass burning

    Treesearch

    Timothy J. Johnson; Luisa T. M. Profeta; Robert L. Sams; David W. T. Griffith; Robert L. Yokelson

    2010-01-01

    We report the construction of a database of infrared spectra aimed at detecting the gases emitted by biomass burning. The project uses many of the methods of the Pacific Northwest National Laboratory (PNNL) infrared database, but the selection of the species and special experimental considerations are optimized. Each spectrum is a weighted average derived from 10 or...

  5. Evaluation of English Achievement Test: A Comparison between High and Low Achievers amongst Selected Elementary School Students of Pakistan

    ERIC Educational Resources Information Center

    Haider, Zubair; Latif, Farah; Akhtar, Samina; Mushtaq, Maria

    2012-01-01

    Validity, reliability and item analysis are critical to the process of evaluating the quality of an educational measurement. The present study evaluates the quality of an assessment constructed to measure elementary school student's achievement in English. In this study, the survey model of descriptive research was used as a research method.…

  6. Differential Distractor Functioning as a Method for Explaining DIF: The Case of a National Admissions Test in Saudi Arabia

    ERIC Educational Resources Information Center

    Tsaousis, Ioannis; Sideridis, Georgios; Al-Saawi, Fahad

    2018-01-01

    The aim of the present study was to examine Differential Distractor Functioning (DDF) as a means of improving the quality of a measure through understanding biased responses across groups. A DDF analysis could shed light on the potential sources of construct-irrelevant variance by examining whether the differential selection of incorrect choices…

  7. Modification of the Integrated Sasang Constitutional Diagnostic Model

    PubMed Central

    Nam, Jiho

    2017-01-01

    In 2012, the Korea Institute of Oriental Medicine proposed an objective and comprehensive physical diagnostic model to address quantification problems in the existing Sasang constitutional diagnostic method. However, certain issues have been raised regarding a revision of the proposed diagnostic model. In this paper, we propose various methodological approaches to address the problems of the previous diagnostic model. Firstly, more useful variables are selected in each component. Secondly, the least absolute shrinkage and selection operator is used to reduce multicollinearity without the modification of explanatory variables. Thirdly, proportions of SC types and age are considered to construct individual diagnostic models and classify the training set and the test set for reflecting the characteristics of the entire dataset. Finally, an integrated model is constructed with explanatory variables of individual diagnosis models. The proposed integrated diagnostic model significantly improves the sensitivities for both the male SY type (36.4% → 62.0%) and the female SE type (43.7% → 64.5%), which were areas of limitation of the previous integrated diagnostic model. The ideas of these new algorithms are expected to contribute not only to the scientific development of Sasang constitutional medicine in Korea but also to that of other diagnostic methods for traditional medicine. PMID:29317897
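The least absolute shrinkage and selection operator used in this record to reduce multicollinearity can be illustrated with a minimal coordinate-descent implementation. This is a generic sketch, not the study's diagnostic model: the data, penalty weight and the ten candidate variables are invented for demonstration.

```python
import numpy as np

def lasso_cd(X, y, alpha, n_iter=200):
    """Plain coordinate-descent LASSO: minimises (1/2n)||y - Xb||^2 + alpha*||b||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            resid = y - X @ beta + X[:, j] * beta[j]  # partial residual excluding j
            rho = X[:, j] @ resid
            # soft-thresholding: the L1 penalty zeroes out weak variables
            beta[j] = np.sign(rho) * max(abs(rho) - alpha * n, 0.0) / col_sq[j]
    return beta

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))                  # 10 hypothetical candidate variables
coef_true = np.array([1.5, 0, 0, -2.0, 0, 0, 0, 0.8, 0, 0])
y = X @ coef_true + rng.normal(scale=0.5, size=200)

beta = lasso_cd(X, y, alpha=0.1)
selected = np.flatnonzero(np.abs(beta) > 1e-6)  # variables surviving the penalty
print(selected)
```

Because the L1 penalty sets entire coefficients to zero rather than merely shrinking them, correlated and uninformative predictors drop out of the model without the manual variable transformations the previous diagnostic model required.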

  8. Constructing Compact Takagi-Sugeno Rule Systems: Identification of Complex Interactions in Epidemiological Data

    PubMed Central

    Zhou, Shang-Ming; Lyons, Ronan A.; Brophy, Sinead; Gravenor, Mike B.

    2012-01-01

    The Takagi-Sugeno (TS) fuzzy rule system is a widely used data mining technique, and is of particular use in the identification of non-linear interactions between variables. However the number of rules increases dramatically when applied to high dimensional data sets (the curse of dimensionality). Few robust methods are available to identify important rules while removing redundant ones, and this results in limited applicability in fields such as epidemiology or bioinformatics where the interaction of many variables must be considered. Here, we develop a new parsimonious TS rule system. We propose three statistics: R, L, and ω-values, to rank the importance of each TS rule, and a forward selection procedure to construct a final model. We use our method to predict how key components of childhood deprivation combine to influence educational achievement outcome. We show that a parsimonious TS model can be constructed, based on a small subset of rules, that provides an accurate description of the relationship between deprivation indices and educational outcomes. The selected rules shed light on the synergistic relationships between the variables, and reveal that the effect of targeting specific domains of deprivation is crucially dependent on the state of the other domains. Policy decisions need to incorporate these interactions, and deprivation indices should not be considered in isolation. The TS rule system provides a basis for such decision making, and has wide applicability for the identification of non-linear interactions in complex biomedical data. PMID:23272108

  9. Constructing compact Takagi-Sugeno rule systems: identification of complex interactions in epidemiological data.

    PubMed

    Zhou, Shang-Ming; Lyons, Ronan A; Brophy, Sinead; Gravenor, Mike B

    2012-01-01

    The Takagi-Sugeno (TS) fuzzy rule system is a widely used data mining technique, and is of particular use in the identification of non-linear interactions between variables. However the number of rules increases dramatically when applied to high dimensional data sets (the curse of dimensionality). Few robust methods are available to identify important rules while removing redundant ones, and this results in limited applicability in fields such as epidemiology or bioinformatics where the interaction of many variables must be considered. Here, we develop a new parsimonious TS rule system. We propose three statistics: R, L, and ω-values, to rank the importance of each TS rule, and a forward selection procedure to construct a final model. We use our method to predict how key components of childhood deprivation combine to influence educational achievement outcome. We show that a parsimonious TS model can be constructed, based on a small subset of rules, that provides an accurate description of the relationship between deprivation indices and educational outcomes. The selected rules shed light on the synergistic relationships between the variables, and reveal that the effect of targeting specific domains of deprivation is crucially dependent on the state of the other domains. Policy decisions need to incorporate these interactions, and deprivation indices should not be considered in isolation. The TS rule system provides a basis for such decision making, and has wide applicability for the identification of non-linear interactions in complex biomedical data.
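The forward selection procedure described in this record (rank candidate rules, then greedily add the one that most improves the model) can be sketched in miniature. The columns of `Phi` below stand in for rule firing strengths; the data, the least-squares fit and the fixed stopping point are hypothetical simplifications, not the paper's R/L/ω-value statistics.

```python
import numpy as np

def forward_select(Phi, y, max_terms=3):
    """Greedy forward selection: at each step add the candidate column
    (a stand-in for one fuzzy rule's firing strengths) that most
    reduces the least-squares fitting error."""
    chosen, remaining = [], list(range(Phi.shape[1]))
    for _ in range(max_terms):
        best_j, best_err = None, np.inf
        for j in remaining:
            cols = Phi[:, chosen + [j]]
            coef, *_ = np.linalg.lstsq(cols, y, rcond=None)
            err = np.linalg.norm(y - cols @ coef)
            if err < best_err:
                best_j, best_err = j, err
        chosen.append(best_j)
        remaining.remove(best_j)
    return chosen

rng = np.random.default_rng(1)
Phi = rng.normal(size=(100, 8))                 # 8 candidate "rules"
y = 2.0 * Phi[:, 2] - 1.0 * Phi[:, 5] + rng.normal(scale=0.1, size=100)
print(forward_select(Phi, y, max_terms=2))      # recovers the two informative columns
```

The parsimony argument in the abstract follows directly: stopping the greedy loop early yields a small rule subset that still explains most of the outcome variance, instead of the exponentially many rules a full TS system would generate.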

  10. 29 CFR 1926.1403 - Assembly/Disassembly-selection of manufacturer or employer procedures.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 29 Labor 8 2013-07-01 2013-07-01 false Assembly/Disassembly-selection of manufacturer or employer... CONSTRUCTION Cranes and Derricks in Construction § 1926.1403 Assembly/Disassembly—selection of manufacturer or... applicable to assembly and disassembly, or (b) Employer procedures for assembly and disassembly. Employer...

  11. 29 CFR 1926.1403 - Assembly/Disassembly-selection of manufacturer or employer procedures.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 29 Labor 8 2012-07-01 2012-07-01 false Assembly/Disassembly-selection of manufacturer or employer... CONSTRUCTION Cranes and Derricks in Construction § 1926.1403 Assembly/Disassembly—selection of manufacturer or... applicable to assembly and disassembly, or (b) Employer procedures for assembly and disassembly. Employer...

  12. 29 CFR 1926.1403 - Assembly/Disassembly-selection of manufacturer or employer procedures.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 29 Labor 8 2014-07-01 2014-07-01 false Assembly/Disassembly-selection of manufacturer or employer... CONSTRUCTION Cranes and Derricks in Construction § 1926.1403 Assembly/Disassembly—selection of manufacturer or... applicable to assembly and disassembly, or (b) Employer procedures for assembly and disassembly. Employer...

  13. 29 CFR 1926.1403 - Assembly/Disassembly-selection of manufacturer or employer procedures.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 8 2011-07-01 2011-07-01 false Assembly/Disassembly-selection of manufacturer or employer... CONSTRUCTION Cranes and Derricks in Construction § 1926.1403 Assembly/Disassembly—selection of manufacturer or... applicable to assembly and disassembly, or (b) Employer procedures for assembly and disassembly. Employer...

  14. Study directed at development of an implantable biotelemetry ion detector

    NASA Technical Reports Server (NTRS)

    Hanley, L. D.; Kress, D.

    1971-01-01

    A literature search was conducted to update current information in the field of ion-selective electrodes. The review attempts to identify present trends in cation- and anion-selective electrodes pertinent to the area of bioimplantable units. An electronic circuit was designed to provide the high-impedance interface between the ion-selective sensors and signal-processing equipment. The resulting design emphasized the need for low power and miniaturization. Many of the circuits were constructed and used to evaluate the ion-selective electrodes. A cuvette capable of holding the ion-selective and reference electrodes was designed and constructed. This equipment was used to evaluate commercially available ion-selective electrodes and the electrodes designed and constructed in the study. The results of the electrode tests are included.

  15. Structure and weights optimisation of a modified Elman network emotion classifier using hybrid computational intelligence algorithms: a comparative study

    NASA Astrophysics Data System (ADS)

    Sheikhan, Mansour; Abbasnezhad Arabi, Mahdi; Gharavian, Davood

    2015-10-01

    Artificial neural networks are efficient models in pattern recognition applications, but their performance is dependent on employing suitable structure and connection weights. This study used a hybrid method for obtaining the optimal weight set and architecture of a recurrent neural emotion classifier based on gravitational search algorithm (GSA) and its binary version (BGSA), respectively. By considering the features of speech signal that were related to prosody, voice quality, and spectrum, a rich feature set was constructed. To select more efficient features, a fast feature selection method was employed. The performance of the proposed hybrid GSA-BGSA method was compared with similar hybrid methods based on particle swarm optimisation (PSO) algorithm and its binary version, PSO and discrete firefly algorithm, and hybrid of error back-propagation and genetic algorithm that were used for optimisation. Experimental tests on Berlin emotional database demonstrated the superior performance of the proposed method using a lighter network structure.

  16. The Construct of Attention in Schizophrenia

    PubMed Central

    Luck, Steven J.; Gold, James M.

    2008-01-01

    Schizophrenia is widely thought to involve deficits of attention. However, the term attention can be defined so broadly that impaired performance on virtually any task could be construed as evidence for a deficit in attention, and this has slowed cumulative progress in understanding attention deficits in schizophrenia. To address this problem, we divide the general concept of attention into two distinct constructs: input selection, the selection of task-relevant inputs for further processing; and rule selection, the selective activation of task-appropriate rules. These constructs are closely tied to working memory, because input selection mechanisms are used to control the transfer of information into working memory and because working memory stores the rules used by rule selection mechanisms. These constructs are also closely tied to executive function, because executive systems are used to guide input selection and because rule selection is itself a key aspect of executive function. Within the domain of input selection, it is important to distinguish between the control of selection—the processes that guide attention to task-relevant inputs—and the implementation of selection—the processes that enhance the processing of the relevant inputs and suppress the irrelevant inputs. Current evidence suggests that schizophrenia involves a significant impairment in the control of selection but little or no impairment in the implementation of selection. Consequently, the CNTRICS participants agreed by consensus that attentional control should be a priority target for measurement and treatment research in schizophrenia. PMID:18374901

  17. A versatile and efficient high-throughput cloning tool for structural biology.

    PubMed

    Geertsma, Eric R; Dutzler, Raimund

    2011-04-19

    Methods for the cloning of large numbers of open reading frames into expression vectors are of critical importance for challenging structural biology projects. Here we describe a system termed fragment exchange (FX) cloning that facilitates the high-throughput generation of expression constructs. The method is based on a class IIS restriction enzyme and negative selection markers. FX cloning combines attractive features of established recombination- and ligation-independent cloning methods: It allows the straightforward transfer of an open reading frame into a variety of expression vectors and is highly efficient and very economic in its use. In addition, FX cloning avoids the common but undesirable feature of significantly extending target open reading frames with cloning related sequences, as it leaves a minimal seam of only a single extra amino acid to either side of the protein. The method has proven to be very robust and suitable for all common pro- and eukaryotic expression systems. It considerably speeds up the generation of expression constructs compared to traditional methods and thus facilitates a broader expression screening.

  18. Nucleic acid constructs containing orthogonal site selective recombinases (OSSRs)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilmore, Joshua M.; Anderson, J. Christopher; Dueber, John E.

    The present invention provides for a recombinant nucleic acid comprising a nucleotide sequence comprising a plurality of constructs, wherein each construct independently comprises a nucleotide sequence of interest flanked by a pair of recombinase recognition sequences. Each pair of recombinase recognition sequences is recognized by a distinct recombinase. Optionally, each construct can, independently, further comprise one or more genes encoding a recombinase capable of recognizing the pair of recombinase recognition sequences of the construct. The recombinase can be an orthogonal (non-cross reacting), site-selective recombinase (OSSR).

  19. Genetic selection for a highly functional cysteine-less membrane protein using site-saturation mutagenesis

    PubMed Central

    Arendt, Cassandra S.; Ri, Keirei; Yates, Phillip A.; Ullman, Buddy

    2007-01-01

    We describe an efficient method for generating highly functional membrane proteins with variant amino acids at defined positions that couples a modified site-saturation strategy with functional genetic selection. We applied this method to the production of a cysteine-less variant of the Crithidia fasciculata inosine-guanosine permease CfNT2, in order to facilitate biochemical studies using thiol-specific modifying reagents. Of ten endogenous cysteine residues in CfNT2, two cannot be replaced with serine or alanine without loss of function. High-quality single- and double-mutant libraries were produced by combining a previously reported site-saturation mutagenesis scheme based on the Quikchange method with a novel gel purification step that effectively eliminated template DNA from the products. Following selection for functional complementation in S. cerevisiae cells auxotrophic for purines, several highly functional non-cysteine substitutions were efficiently identified at each desired position, allowing the construction of cysteine-less variants of CfNT2 that retained wild-type affinity for inosine. This combination of an improved site-saturation mutagenesis technique and positive genetic selection provides a simple and efficient means to identify functional and perhaps unexpected amino acid variants at a desired position. PMID:17481563

  20. Cloud decision model for selecting sustainable energy crop based on linguistic intuitionistic information

    NASA Astrophysics Data System (ADS)

    Peng, Hong-Gang; Wang, Jian-Qiang

    2017-11-01

    In recent years, the sustainable energy crop has become an important energy development strategy topic in many countries. Selecting the most sustainable energy crop is a significant problem that must be addressed during any biofuel production process. The focus of this study is the development of an innovative multi-criteria decision-making (MCDM) method to handle sustainable energy crop selection problems. Given that various uncertain data are encountered in the evaluation of sustainable energy crops, linguistic intuitionistic fuzzy numbers (LIFNs) are introduced to present the information necessary to the evaluation process. Processing qualitative concepts requires the effective support of reliable tools; a cloud model can then be used to deal with linguistic intuitionistic information. First, LIFNs are converted and a novel concept of linguistic intuitionistic cloud (LIC) is proposed. The operations, score function and similarity measurement of the LICs are defined. Subsequently, the linguistic intuitionistic cloud density-prioritised weighted Heronian mean operator is developed, which serves as the basis for the construction of an applicable MCDM model for sustainable energy crop selection. Finally, an illustrative example is provided to demonstrate the proposed method, and its feasibility and validity are further verified by comparing it with other existing methods.

  1. Total sulfur determination in residues of crude oil distillation using FT-IR/ATR and variable selection methods

    NASA Astrophysics Data System (ADS)

    Müller, Aline Lima Hermes; Picoloto, Rochele Sogari; Mello, Paola de Azevedo; Ferrão, Marco Flores; dos Santos, Maria de Fátima Pereira; Guimarães, Regina Célia Lourenço; Müller, Edson Irineu; Flores, Erico Marlon Moraes

    2012-04-01

    Total sulfur concentration was determined in atmospheric residue (AR) and vacuum residue (VR) samples obtained from the petroleum distillation process by Fourier transform infrared spectroscopy with attenuated total reflectance (FT-IR/ATR) in association with chemometric methods. The calibration and prediction sets consisted of 40 and 20 samples, respectively. Calibration models were developed using two variable selection methods: interval partial least squares (iPLS) and synergy interval partial least squares (siPLS). Different treatments and pre-processing steps were also evaluated for the development of the models. Pre-treatment based on multiplicative scatter correction (MSC) and mean-centered data were selected for model construction. The use of siPLS as the variable selection method provided a model with root mean square error of prediction (RMSEP) values significantly better than those obtained by the PLS model using all variables. The best model was obtained using the siPLS algorithm with spectra divided into 20 intervals and combinations of 3 intervals (911-824, 823-736 and 737-650 cm-1). This model produced an RMSECV of 400 mg kg-1 S and an RMSEP of 420 mg kg-1 S, with a correlation coefficient of 0.990.
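The interval-selection idea behind siPLS (divide the spectrum into intervals, then search combinations of intervals for the lowest cross-validated error) can be sketched as follows. Note that ordinary least squares stands in for PLS regression to keep the sketch short, and the synthetic "spectra" are invented; this is the general siPLS search pattern, not the authors' implementation.

```python
import numpy as np
from itertools import combinations

def rmse_cv(X, y, k=5):
    """k-fold cross-validated RMSE for a least-squares fit (PLS stand-in)."""
    idx = np.arange(len(y))
    errs = []
    for fold in range(k):
        test = idx[fold::k]
        train = np.setdiff1d(idx, test)
        coef, *_ = np.linalg.lstsq(X[train], y[train], rcond=None)
        errs.append(np.mean((y[test] - X[test] @ coef) ** 2))
    return np.sqrt(np.mean(errs))

def si_select(X, y, n_intervals=10, n_combine=2):
    """Try every combination of spectral intervals; keep the combination
    with the lowest cross-validated error (the siPLS idea in miniature)."""
    intervals = np.array_split(np.arange(X.shape[1]), n_intervals)
    best, best_err = None, np.inf
    for combo in combinations(range(n_intervals), n_combine):
        cols = np.concatenate([intervals[i] for i in combo])
        err = rmse_cv(X[:, cols], y)
        if err < best_err:
            best, best_err = combo, err
    return best, best_err

# synthetic spectra: only intervals 3 and 7 carry information about y
rng = np.random.default_rng(2)
X = rng.normal(size=(60, 100))
y = 0.5 * X[:, 30:40].sum(axis=1) + X[:, 70:80].sum(axis=1) + rng.normal(scale=0.3, size=60)
best, err = si_select(X, y)
print(best)
```

Restricting the model to a few informative intervals, as the record reports for the 911-650 cm-1 region, both reduces noise from uninformative wavenumbers and keeps the final model interpretable.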

  2. Technical Competencies Applied in Experimental Fluid Dynamics

    NASA Astrophysics Data System (ADS)

    Tagg, Randall

    2017-11-01

    The practical design, construction, and operation of fluid dynamics experiments require a broad range of competencies. Three types are instrumental, procedural, and design. Respective examples would be operation of a spectrum analyzer, soft-soldering or brazing flow plumbing, and design of a small wind tunnel. Some competencies, such as the selection and installation of pumping systems, are unique to fluid dynamics and fluids engineering. Others, such as the design and construction of electronic amplifiers or optical imaging systems, overlap with other fields. Thus the identification and development of learning materials and methods for instruction are part of a larger effort to identify competencies needed in active research and technical innovation.

  3. Oxygen ion-beam microlithography

    DOEpatents

    Tsuo, Y.S.

    1991-08-20

    A method of providing and developing a resist on a substrate for constructing integrated circuit (IC) chips includes the following steps: depositing a thin film of amorphous silicon or hydrogenated amorphous silicon on the substrate, and exposing portions of the amorphous silicon to low-energy oxygen ion beams to oxidize the amorphous silicon at those selected portions. The nonoxidized portions are then removed by etching with RF-excited hydrogen plasma. Components of the IC chip can then be constructed through the removed portions of the resist. The entire process can be performed in an in-line vacuum production system having several vacuum chambers. Nitrogen or carbon ion beams can also be used. 5 figures.

  4. The influence of construction measurement and structure storey on seismic performance of masonry structure

    NASA Astrophysics Data System (ADS)

    Sun, Baitao; Zhao, Hexian; Yan, Peilei

    2017-08-01

    The damage to masonry structures in earthquakes is generally more severe than in other structures. Through the analysis of two typical earthquake-damaged buildings of Xuankou middle school in the Wenchuan earthquake, we found that the number of storeys and the construction measures had great influence on the seismic performance of masonry structures. This paper takes a teachers' dormitory in Xuankou middle school as an example, selecting the structure arrangement and the number of storeys as two independent variables to design the working conditions. Finally, we investigated the difference in seismic performance of the masonry structure under the two variables by the finite element analysis method.

  5. Oxygen ion-beam microlithography

    DOEpatents

    Tsuo, Y. Simon

    1991-01-01

    A method of providing and developing a resist on a substrate for constructing integrated circuit (IC) chips includes the following steps: depositing a thin film of amorphous silicon or hydrogenated amorphous silicon on the substrate, and exposing portions of the amorphous silicon to low-energy oxygen ion beams to oxidize the amorphous silicon at those selected portions. The nonoxidized portions are then removed by etching with RF-excited hydrogen plasma. Components of the IC chip can then be constructed through the removed portions of the resist. The entire process can be performed in an in-line vacuum production system having several vacuum chambers. Nitrogen or carbon ion beams can also be used.

  6. Preclinical safety assessments of nano-sized constructs on cardiovascular system toxicity: A case for telemetry.

    PubMed

    Cheah, Hoay Yan; Kiew, Lik Voon; Lee, Hong Boon; Japundžić-Žigon, Nina; Vicent, Marίa J; Hoe, See Ziau; Chung, Lip Yong

    2017-11-01

    While nano-sized construct (NSC) use in medicine has grown significantly in recent years, reported unwanted side effects have raised safety concerns. However, the toxicity of NSCs to the cardiovascular system (CVS) and the relative merits of the associated evaluation methods have not been thoroughly studied. This review discusses the toxicological profiles of selected NSCs and provides an overview of the assessment methods, including in silico, in vitro, ex vivo and in vivo models and how they are related to CVS toxicity. We conclude the review by outlining the merits of telemetry coupled with spectral analysis, baroreceptor reflex sensitivity analysis and echocardiography as an appropriate integrated strategy for the assessment of the acute and chronic impact of NSCs on the CVS. Copyright © 2017 John Wiley & Sons, Ltd.

  7. Trajectory of Sewerage System Development Optimization

    NASA Astrophysics Data System (ADS)

    Chupin, R. V.; Mayzel, I. V.; Chupin, V. R.

    2017-11-01

    The transition to market relations has determined a new technology for our country to manage the development of urban engineering systems. This technology has shifted to the municipal level and can, in large part, be presented in two stages: the first is the development of a scheme for the water supply and sanitation system; the second is the implementation of this scheme on the basis of the investment programs of utilities. In the investment programs, financial support for the development and reconstruction of water disposal systems is provided by the investment component in the tariff, connection fees for newly commissioned capital construction projects, targeted financing from selected state and municipal programs, and loans and credits. Financial resources for the development of sewerage systems are thus limited, and the problem arises of rationally distributing them between the construction of new water disposal facilities and the reconstruction of existing ones. The paper suggests a methodology for developing options for the development of sewerage systems and selecting the best of them by the life cycle cost criterion, taking into account the limited investment in their construction, together with models and methods of analysis that optimize their reconstruction and development with regard to reliability and seismic resistance.

  8. Making the Most of Your School Site. School Buildings Planning, Design, and Construction Series No. 2.

    ERIC Educational Resources Information Center

    Odell, John H.

    A school construction guide offers key personnel in school development projects information on the complex task of master planning and construction of schools in Australia. This chapter of the guide provides advice on site selection covering selection criteria; traffic issues; and site services, such as water, power, and sewer. Additionally…

  9. Adaptive feature selection using v-shaped binary particle swarm optimization.

    PubMed

    Teng, Xuyang; Dong, Hongbin; Zhou, Xiurong

    2017-01-01

    Feature selection is an important preprocessing method in machine learning and data mining. This process can be used not only to reduce the amount of data to be analyzed but also to build models with stronger interpretability based on fewer features. Traditional feature selection methods evaluate the dependency and redundancy of features separately, which leads to a lack of measurement of their combined effect. Moreover, a greedy search considers only the optimization of the current round and thus cannot be a global search. To evaluate the combined effect of different subsets in the entire feature space, an adaptive feature selection method based on V-shaped binary particle swarm optimization is proposed. In this method, the fitness function is constructed using the correlation information entropy. Feature subsets are regarded as individuals in a population, and the feature space is searched using V-shaped binary particle swarm optimization. The above procedure overcomes the hard constraint on the number of features, enables the combined evaluation of each subset as a whole, and improves the search ability of conventional binary particle swarm optimization. The proposed algorithm is an adaptive method with respect to the number of feature subsets. The experimental results show the advantages of optimizing the feature subsets using the V-shaped transfer function and confirm the effectiveness and efficiency of the feature subsets obtained under different classifiers.
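    The core mechanics of V-shaped binary PSO are small enough to sketch: real-valued velocities are updated as in standard PSO, a V-shaped transfer function such as |tanh(v)| maps each velocity to a flip probability, and bits are complemented accordingly. The fitness below is a plain least-squares validation error with a parsimony penalty, a deliberate simplification of the paper's correlation-information-entropy fitness.

```python
# Minimal V-shaped binary PSO for feature selection (illustrative fitness:
# least-squares validation error plus a small penalty per selected feature).
import numpy as np

rng = np.random.default_rng(1)
n, d = 120, 12
X = rng.normal(size=(n, d))
y = X[:, 0] + 2 * X[:, 3] - X[:, 7] + 0.05 * rng.normal(size=n)  # informative: 0, 3, 7
Xtr, Xte, ytr, yte = X[:80], X[80:], y[:80], y[80:]

def fitness(mask):
    if not mask.any():
        return np.inf
    idx = np.flatnonzero(mask)
    coef, *_ = np.linalg.lstsq(Xtr[:, idx], ytr, rcond=None)
    err = Xte[:, idx] @ coef - yte
    return np.sqrt(np.mean(err ** 2)) + 0.01 * mask.sum()        # parsimony penalty

n_particles, n_iter = 30, 80
pos = (rng.random((n_particles, d)) < 0.5).astype(int)
pos[0] = 1                                  # seed one particle with the full feature set
vel = rng.normal(scale=0.1, size=(n_particles, d))
pbest = pos.copy()
pbest_fit = np.array([fitness(p) for p in pos])
gbest = pbest[np.argmin(pbest_fit)].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, d))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    flip = rng.random((n_particles, d)) < np.abs(np.tanh(vel))   # V-shaped transfer
    pos = np.where(flip, 1 - pos, pos)                           # flip bits, don't reset them
    fit = np.array([fitness(p) for p in pos])
    improved = fit < pbest_fit
    pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
    gbest = pbest[np.argmin(pbest_fit)].copy()

print("selected features:", sorted(np.flatnonzero(gbest).tolist()))
```

Flipping rather than resetting bits is what distinguishes the V-shaped transfer from the classical S-shaped (sigmoid) binary PSO.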

  10. Adaptive feature selection using v-shaped binary particle swarm optimization

    PubMed Central

    Dong, Hongbin; Zhou, Xiurong

    2017-01-01

    Feature selection is an important preprocessing method in machine learning and data mining. This process can be used not only to reduce the amount of data to be analyzed but also to build models with stronger interpretability based on fewer features. Traditional feature selection methods evaluate the dependency and redundancy of features separately, which leads to a lack of measurement of their combined effect. Moreover, a greedy search considers only the optimization of the current round and thus cannot be a global search. To evaluate the combined effect of different subsets in the entire feature space, an adaptive feature selection method based on V-shaped binary particle swarm optimization is proposed. In this method, the fitness function is constructed using the correlation information entropy. Feature subsets are regarded as individuals in a population, and the feature space is searched using V-shaped binary particle swarm optimization. The above procedure overcomes the hard constraint on the number of features, enables the combined evaluation of each subset as a whole, and improves the search ability of conventional binary particle swarm optimization. The proposed algorithm is an adaptive method with respect to the number of feature subsets. The experimental results show the advantages of optimizing the feature subsets using the V-shaped transfer function and confirm the effectiveness and efficiency of the feature subsets obtained under different classifiers. PMID:28358850

  11. Wrapper-based selection of genetic features in genome-wide association studies through fast matrix operations

    PubMed Central

    2012-01-01

    Background: Through the wealth of information contained within them, genome-wide association studies (GWAS) have the potential to provide researchers with a systematic means of associating genetic variants with a wide variety of disease phenotypes. Due to the limitations of approaches that have analyzed single variants one at a time, it has been proposed that the genetic basis of these disorders could be determined through detailed analysis of the genetic variants themselves and in conjunction with one another. The construction of models that account for these subsets of variants requires methodologies that generate predictions based on the total risk of a particular group of polymorphisms. However, due to the excessive number of variants, constructing these types of models has so far been computationally infeasible. Results: We have implemented an algorithm, known as greedy RLS, that we use to perform the first known wrapper-based feature selection on the genome-wide level. The running time of greedy RLS grows linearly in the number of training examples, the number of features in the original data set, and the number of selected features. This speed is achieved through computational short-cuts based on matrix calculus. Since the memory consumption in present-day computers can form an even tighter bottleneck than running time, we also developed a space efficient variation of greedy RLS which trades running time for memory. These approaches are then compared to traditional wrapper-based feature selection implementations based on support vector machines (SVM) to reveal the relative speed-up and to assess the feasibility of the new algorithm. As a proof of concept, we apply greedy RLS to the Hypertension – UK National Blood Service WTCCC dataset and select the most predictive variants using 3-fold external cross-validation in less than 26 minutes on a high-end desktop. 
On this dataset, we also show that greedy RLS has a better classification performance on independent test data than a classifier trained using features selected by a statistical p-value-based filter, which is currently the most popular approach for constructing predictive models in GWAS. Conclusions: Greedy RLS is the first known implementation of a machine learning based method with the capability to conduct a wrapper-based feature selection on an entire GWAS containing several thousand examples and over 400,000 variants. In our experiments, greedy RLS selected a highly predictive subset of genetic variants in a fraction of the time spent by wrapper-based selection methods used together with SVM classifiers. The proposed algorithms are freely available as part of the RLScore software library at http://users.utu.fi/aatapa/RLScore/. PMID:22551170
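    The wrapper idea behind greedy RLS can be illustrated with a naive forward-selection loop around ridge regression, scored by the closed-form leave-one-out error of a linear smoother. The published algorithm achieves its linear running time through incremental matrix-calculus updates that this toy version deliberately omits; the data and feature indices below are synthetic.

```python
# Naive greedy forward selection with regularized least squares (ridge),
# scored by closed-form leave-one-out error; a toy stand-in for greedy RLS.
import numpy as np

rng = np.random.default_rng(2)
n, d, lam = 100, 30, 1.0
X = rng.normal(size=(n, d))
y = 3 * X[:, 5] - 2 * X[:, 17] + 0.1 * rng.normal(size=n)

def loo_error(idx):
    """Leave-one-out MSE of ridge on the selected columns, via the hat matrix."""
    A = X[:, idx]
    H = A @ np.linalg.solve(A.T @ A + lam * np.eye(len(idx)), A.T)
    resid = (y - H @ y) / (1.0 - np.diag(H))   # PRESS residuals of a linear smoother
    return np.mean(resid ** 2)

selected, remaining = [], list(range(d))
for _ in range(2):                             # greedily pick two features
    best = min(remaining, key=lambda j: loo_error(selected + [j]))
    selected.append(best)
    remaining.remove(best)

print("selected:", sorted(selected))
```

Greedy RLS evaluates exactly this kind of leave-one-out criterion for every candidate feature, but reuses intermediate matrices so each candidate costs far less than a fresh solve.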

  12. The role of green fluorescent protein (GFP) in transgenic plants to reduce gene silencing phenomena.

    PubMed

    El-Shemy, Hany A; Khalafalla, Mutasim M; Ishimoto, Masao

    2009-01-01

    The green fluorescent protein (GFP) of the jellyfish Aequorea victoria has significant advantages over other reporter genes, because its expression can be detected in living cells without any substrates. Epigenetic phenomena have recently become important to consider in plant biotechnology experiments for elucidating unknown mechanisms. Therefore, embryogenic cells were generated from immature soybean cotyledons and engineered with two different gene constructs (pHV and pHVS) using the gene gun method. Both constructs contain a gene conferring resistance to hygromycin (hpt) as a selective marker and a modified glycinin (11S globulin) gene (V3-1) as a target. However, sGFP(S65T) was used as a reporter gene only in pHVS, in order to study the relation between the use of sGFP(S65T) and gene silencing phenomena. Fluorescence microscopy, used for screening after hygromycin selection, clearly identified the expression of sGFP(S65T) in the transformed soybean embryos bombarded with the pHVS construct. Protein analysis by SDS-PAGE was used to detect gene expression across whole seeds. The percentage of gene down-regulation was markedly higher for the pHV construct than for pHVS. Thus, sGFP(S65T) as a reporter gene in a vector system may play a useful role in transgenic evaluation and in avoiding gene silencing in plants, to the benefit of plant transformation systems.

  13. Maximum entropy PDF projection: A review

    NASA Astrophysics Data System (ADS)

    Baggenstoss, Paul M.

    2017-06-01

    We review maximum entropy (MaxEnt) PDF projection, a method with wide potential applications in statistical inference. The method constructs a sampling distribution for a high-dimensional vector x based on knowing the sampling distribution p(z) of a lower-dimensional feature z = T (x). Under mild conditions, the distribution p(x) having highest possible entropy among all distributions consistent with p(z) may be readily found. Furthermore, the MaxEnt p(x) may be sampled, making the approach useful in Monte Carlo methods. We review the theorem and present a case study in model order selection and classification for handwritten character recognition.
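    The projection identity underlying the method can be stated compactly (notation simplified from the review): given a reference distribution q(x) that induces q(z) on the feature z = T(x), the projected density on x is

```latex
p(\mathbf{x}) \;=\; \frac{p(\mathbf{z})}{q(\mathbf{z})}\, q(\mathbf{x}),
\qquad \mathbf{z} = T(\mathbf{x}),
```

and the MaxEnt variant selects, among all densities p(x) consistent with p(z), the one of highest entropy.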

  14. Investigation of test methods, material properties and processes for solar cell encapsulants

    NASA Technical Reports Server (NTRS)

    Willis, P. B.

    1985-01-01

    The historical development of ethylene vinyl acetate (EVA) is presented, including the functional requirements, polymer selection, curing, stabilization, production and module processing. The construction and use of a new method for the accelerated aging of polymers is detailed. The method more closely resembles the conditions that may be encountered in actual module field exposure and additionally may permit service life to be predicted accurately. The use of hardboard as a low cost candidate substrate material is studied. The performance of surface antisoiling treatments useful for imparting a self cleaning property to modules is updated.

  15. Document co-citation analysis to enhance transdisciplinary research

    PubMed Central

    Trujillo, Caleb M.; Long, Tammy M.

    2018-01-01

    Specialized and emerging fields of research infrequently cross disciplinary boundaries and would benefit from frameworks, methods, and materials informed by other fields. Document co-citation analysis, a method developed by bibliometric research, is demonstrated as a way to help identify key literature for cross-disciplinary ideas. To illustrate the method in a useful context, we mapped peer-recognized scholarship related to systems thinking. In addition, three procedures for validation of co-citation networks are proposed and implemented. This method may be useful for strategically selecting information that can build consilience about ideas and constructs that are relevant across a range of disciplines. PMID:29308433

  16. A novel method for identifying disease associated protein complexes based on functional similarity protein complex networks.

    PubMed

    Le, Duc-Hau

    2015-01-01

    Protein complexes formed by non-covalent interaction among proteins play important roles in cellular functions. Computational and purification methods have been used to identify many protein complexes and their cellular functions. However, their roles in causing disease have not yet been well discovered. There exist only a few studies for the identification of disease-associated protein complexes. However, they mostly utilize complicated heterogeneous networks which are constructed based on an out-of-date database of phenotype similarity collected from the literature. In addition, they apply only to diseases for which tissue-specific data exist. In this study, we propose a method to identify novel disease-protein complex associations. First, we introduce a framework to construct functional similarity protein complex networks, where two protein complexes are functionally connected either by shared protein elements, by shared annotating GO terms or based on protein interactions between elements in each protein complex. Second, we propose a simple but effective neighborhood-based algorithm, which yields a local similarity measure, to rank disease candidate protein complexes. Comparing the predictive performance of our proposed algorithm with that of two state-of-the-art network propagation algorithms, including one we used in our previous study, we found that it performed statistically significantly better than these two algorithms for all the constructed functional similarity protein complex networks. In addition, it ran about 32 times faster than these two algorithms. Moreover, our proposed method always achieved high performance in terms of AUC values irrespective of the way the functional similarity protein complex networks were constructed and the algorithms used. The performance of our method was also higher than that reported for some existing methods based on complicated heterogeneous networks. 
Finally, we also tested our method on prostate cancer and selected the top 100 highly ranked candidate protein complexes. Interestingly, 69 of them were evidenced, since at least one of their protein elements is known to be associated with prostate cancer. Our proposed method, including the framework to construct functional similarity protein complex networks and the neighborhood-based algorithm on these networks, could be used for the identification of novel disease-protein complex associations.
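    As a hypothetical sketch of the neighborhood-based scoring step (the names, similarity values and scoring rule here are illustrative, not taken from the paper): each candidate complex is scored by the similarity-weighted sum of its neighbors that are already known to be disease-associated, and candidates are ranked by that local score.

```python
# Hypothetical neighborhood-based ranking on a functional-similarity network:
# score(candidate) = sum of similarities to known disease-associated complexes.
import numpy as np

complexes = ["C1", "C2", "C3", "C4", "C5"]
known = {"C1"}                       # seed: complexes with a known association
# symmetric functional-similarity matrix between the five complexes
S = np.array([
    [0.0, 0.8, 0.1, 0.0, 0.3],
    [0.8, 0.0, 0.2, 0.0, 0.0],
    [0.1, 0.2, 0.0, 0.5, 0.0],
    [0.0, 0.0, 0.5, 0.0, 0.4],
    [0.3, 0.0, 0.0, 0.4, 0.0],
])

seed = np.array([c in known for c in complexes], dtype=float)
scores = S @ seed                    # local similarity to the known associations
ranking = [complexes[i] for i in np.argsort(-scores) if complexes[i] not in known]
print(ranking)
```

Because the score of a candidate depends only on its direct neighbors, one pass of a matrix-vector product suffices, which is why such local measures run much faster than iterative network propagation.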

  17. Exploration of complex visual feature spaces for object perception

    PubMed Central

    Leeds, Daniel D.; Pyles, John A.; Tarr, Michael J.

    2014-01-01

    The mid- and high-level visual properties supporting object perception in the ventral visual pathway are poorly understood. In the absence of well-specified theory, many groups have adopted a data-driven approach in which they progressively interrogate neural units to establish each unit's selectivity. Such methods are challenging in that they require search through a wide space of feature models and stimuli using a limited number of samples. To more rapidly identify higher-level features underlying human cortical object perception, we implemented a novel functional magnetic resonance imaging method in which visual stimuli are selected in real-time based on BOLD responses to recently shown stimuli. This work was inspired by earlier primate physiology work, in which neural selectivity for mid-level features in IT was characterized using a simple parametric approach (Hung et al., 2012). To extend such work to human neuroimaging, we used natural and synthetic object stimuli embedded in feature spaces constructed on the basis of the complex visual properties of the objects themselves. During fMRI scanning, we employed a real-time search method to control continuous stimulus selection within each image space. This search was designed to maximize neural responses across a pre-determined 1 cm3 brain region within ventral cortex. To assess the value of this method for understanding object encoding, we examined both the behavior of the method itself and the complex visual properties the method identified as reliably activating selected brain regions. We observed: (1) Regions selective for both holistic and component object features and for a variety of surface properties; (2) Object stimulus pairs near one another in feature space that produce responses at the opposite extremes of the measured activity range. 
Together, these results suggest that real-time fMRI methods may yield more widely informative measures of selectivity within the broad classes of visual features associated with cortical object representation. PMID:25309408

  18. A novel method for multifactorial bio-chemical experiments design based on combinational design theory.

    PubMed

    Wang, Xun; Sun, Beibei; Liu, Boyang; Fu, Yaping; Zheng, Pan

    2017-01-01

    Experimental design focuses on describing or explaining the multifactorial interactions that are hypothesized to reflect the variation. The design introduces conditions that may directly affect the variation, where particular conditions are purposely selected for observation. Combinatorial design theory deals with the existence, construction and properties of systems of finite sets whose arrangements satisfy generalized concepts of balance and/or symmetry. In this work, borrowing the concept of "balance" from combinatorial design theory, a novel method for the design of multifactorial bio-chemical experiments is proposed, in which balanced templates from combinatorial design are used to select the conditions for observation. Balanced experimental data that cover all the influencing factors of the experiments can be obtained for further processing, for example as a training set for machine learning models. Finally, software based on the proposed method is developed for designing experiments that cover the influencing factors a given number of times.
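    A Latin square is the simplest example of such a balanced template: for three 3-level factors it covers every pairwise combination of factor levels exactly once in 9 runs instead of 27. The check below is illustrative only; the paper draws on combinatorial design theory more broadly.

```python
# A Latin square as a minimal "balanced template": with three 3-level factors,
# setting c = (a + b) mod 3 guarantees each level of C meets each level of A
# and each level of B exactly once, so 9 runs cover the design instead of 27.
from collections import Counter

n = 3
runs = [(a, b, (a + b) % n) for a in range(n) for b in range(n)]

ac = Counter((a, c) for a, b, c in runs)   # (A, C) level pairs
bc = Counter((b, c) for a, b, c in runs)   # (B, C) level pairs
print(len(runs), all(v == 1 for v in ac.values()), all(v == 1 for v in bc.values()))
```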

  19. A rough set approach for determining weights of decision makers in group decision making.

    PubMed

    Yang, Qiang; Du, Ping-An; Wang, Yong; Liang, Bin

    2017-01-01

    This study aims to present a novel approach for determining the weights of decision makers (DMs) based on rough group decisions in multiple attribute group decision-making (MAGDM) problems. First, we construct a rough group decision matrix from all DMs' decision matrixes on the basis of rough set theory. After that, we derive a positive ideal solution (PIS) founded on the average matrix of the rough group decision, and negative ideal solutions (NISs) founded on the lower and upper limit matrixes of the rough group decision. Then, we obtain the weight of each group member and the priority order of alternatives by using the relative closeness method, which depends on the distances from each individual group member's decision to the PIS and NISs. Through comparisons with existing methods and an on-line business manager selection example, the proposed method is shown to provide more insight into the subjectivity and vagueness of DMs' evaluations and selections.
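    The relative-closeness step can be sketched numerically. In this toy version (the ratings and distance choices are illustrative), the PIS is the element-wise mean of the DMs' matrices and the two NISs are the element-wise minimum and maximum, standing in for the rough lower and upper limit matrixes; a DM far from the group consensus receives a lower weight.

```python
# Toy relative-closeness weighting of decision makers: distance to the group
# average (PIS) versus distance to the limit matrices (NIS stand-ins).
import numpy as np

D = np.array([                       # shape: (n_dms, n_alternatives, n_criteria)
    [[0.7, 0.6], [0.4, 0.5]],
    [[0.8, 0.5], [0.5, 0.6]],
    [[0.2, 0.9], [0.9, 0.1]],        # an outlying decision maker
])

pis = D.mean(axis=0)                 # positive ideal: the group average matrix
nis_lo, nis_hi = D.min(axis=0), D.max(axis=0)

d_pos = np.linalg.norm(D - pis, axis=(1, 2))                   # Frobenius distances
d_neg = (np.linalg.norm(D - nis_lo, axis=(1, 2))
         + np.linalg.norm(D - nis_hi, axis=(1, 2)))
closeness = d_neg / (d_pos + d_neg)  # near PIS and far from NISs -> high closeness
weights = closeness / closeness.sum()
print(np.round(weights, 3))
```

The outlying third DM ends up with the smallest weight, which is exactly the behavior the approach uses to temper subjective or extreme evaluations.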

  20. Catalytic, Enantioselective Sulfenofunctionalisation of Alkenes: Mechanistic, Crystallographic, and Computational Studies

    PubMed Central

    Denmark, Scott E.; Hartmann, Eduard; Kornfilt, David J. P.; Wang, Hao

    2015-01-01

    The stereocontrolled introduction of vicinal heteroatomic substituents into organic molecules is one of the most powerful ways of adding value and function. Whereas many methods exist for the introduction of oxygen- and nitrogen-containing substituents, the number of stereocontrolled methods for the introduction of sulfur-containing substituents pales by comparison. Previous reports from these laboratories have described the sulfenofunctionalization of alkenes to construct vicinal carbon-sulfur and carbon-oxygen, carbon-nitrogen, as well as carbon-carbon bonds with high levels of diastereospecificity and enantioselectivity. This process is enabled by the concept of Lewis base activation of Lewis acids, which provides activation of Group 16 electrophiles. To provide a foundation for expansion of substrate scope and improved selectivities, we have undertaken a comprehensive study of the catalytically active species. Insights gleaned from kinetic, crystallographic and computational methods have led to the introduction of a new family of sulfenylating agents that provide significantly enhanced selectivities. PMID:25411883

  1. Classification of 'Chemlali' accessions according to the geographical area using chemometric methods of phenolic profiles analysed by HPLC-ESI-TOF-MS.

    PubMed

    Taamalli, Amani; Arráez Román, David; Zarrouk, Mokhtar; Segura-Carretero, Antonio; Fernández-Gutiérrez, Alberto

    2012-05-01

    The present work describes a classification method for Tunisian 'Chemlali' olive oils based on their phenolic composition and geographical area. For this purpose, the data obtained by HPLC-ESI-TOF-MS from 13 samples of extra virgin olive oil, obtained from different production areas throughout the country, were used, focusing on the 23 phenolic compounds detected. The quantitative results showed a significant variability among the analysed oil samples. A factor analysis method using principal components was applied to the data in order to reduce the number of factors which explain the variability of the selected compounds. The data matrix constructed was subjected to a canonical discriminant analysis (CDA) in order to classify the oil samples. These results showed that 100% of cross-validated original group cases were correctly classified, which proves the usefulness of the selected variables. Copyright © 2011 Elsevier Ltd. All rights reserved.

  2. Turbidity monitoring at select MDOT construction sites.

    DOT National Transportation Integrated Search

    2012-06-01

    The objective of this project was to establish baseline turbidity conditions at select construction : sites by establishing a water quality monitoring program and documenting MDOT approved : BMPs on site. In 2009 the United States Environmental Prote...

  3. Numerical solution of modified differential equations based on symmetry preservation.

    PubMed

    Ozbenli, Ersin; Vedula, Prakash

    2017-12-01

    In this paper, we propose a method to construct invariant finite-difference schemes for solution of partial differential equations (PDEs) via consideration of modified forms of the underlying PDEs. The invariant schemes, which preserve Lie symmetries, are obtained based on the method of equivariant moving frames. While it is often difficult to construct invariant numerical schemes for PDEs due to complicated symmetry groups associated with cumbersome discrete variable transformations, we note that symmetries associated with more convenient transformations can often be obtained by appropriately modifying the original PDEs. In some cases, modifications to the original PDEs are also found to be useful in order to avoid trivial solutions that might arise from particular selections of moving frames. In our proposed method, modified forms of PDEs can be obtained either by addition of perturbation terms to the original PDEs or through defect correction procedures. These additional terms, whose primary purpose is to enable symmetries with more convenient transformations, are then removed from the system by considering moving frames for which these specific terms go to zero. Further, we explore selection of appropriate moving frames that result in improvement in accuracy of invariant numerical schemes based on modified PDEs. The proposed method is tested using the linear advection equation (in one- and two-dimensions) and the inviscid Burgers' equation. Results obtained for these tests cases indicate that numerical schemes derived from the proposed method perform significantly better than existing schemes not only by virtue of improvement in numerical accuracy but also due to preservation of qualitative properties or symmetries of the underlying differential equations.

  4. Entropy-based gene ranking without selection bias for the predictive classification of microarray data.

    PubMed

    Furlanello, Cesare; Serafini, Maria; Merler, Stefano; Jurman, Giuseppe

    2003-11-06

    We describe the E-RFE method for gene ranking, which is useful for the identification of markers in the predictive classification of array data. The method supports a practical modeling scheme designed to avoid the construction of classification rules based on the selection of too small gene subsets (an effect known as the selection bias, in which the estimated predictive errors are too optimistic due to testing on samples already considered in the feature selection process). With E-RFE, we speed up the recursive feature elimination (RFE) with SVM classifiers by eliminating chunks of uninteresting genes using an entropy measure of the SVM weights distribution. An optimal subset of genes is selected according to a two-strata model evaluation procedure: modeling is replicated by an external stratified-partition resampling scheme, and, within each run, an internal K-fold cross-validation is used for E-RFE ranking. Also, the optimal number of genes can be estimated according to the saturation of Zipf's law profiles. Without a decrease of classification accuracy, E-RFE allows a speed-up factor of 100 with respect to standard RFE, while improving on alternative parametric RFE reduction strategies. Thus, a process for gene selection and error estimation is made practical, ensuring control of the selection bias, and providing additional diagnostic indicators of gene importance.
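    The entropy-guided elimination step can be sketched as follows; the binning, the chunk-size rule and the surrounding SVM re-training loop are simplified placeholders rather than the published recipe. The idea is that a strongly non-uniform weight distribution (low entropy) licenses discarding a large chunk of low-weight genes in a single pass instead of one gene per RFE iteration.

```python
# Illustrative entropy-guided chunk elimination (the E-RFE idea): bin the
# classifier weights, measure the entropy of the histogram, and discard a
# chunk of low-weight features whose size grows as entropy falls.
import numpy as np

rng = np.random.default_rng(3)
w = np.abs(rng.normal(size=1000)) * rng.random(1000)   # stand-in for |SVM weights|

def weight_entropy(w, n_bins=20):
    hist, _ = np.histogram(w, bins=n_bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p)), np.log2(n_bins)    # (entropy, max entropy)

h, h_max = weight_entropy(w)
# Far from uniform (low entropy): many near-zero weights -> drop a big chunk.
chunk = int(len(w) * (1 - h / h_max))
keep = np.argsort(w)[chunk:]                           # keep the highest-weight features
print(len(w) - len(keep), "features eliminated in one step")
```

In the full method this step is repeated with SVM re-training after each elimination, which is where the ~100x speed-up over one-at-a-time RFE comes from.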

  5. From standard alpha-stable Lévy motions to horizontal visibility networks: dependence of multifractal and Laplacian spectrum

    NASA Astrophysics Data System (ADS)

    Zou, Hai-Long; Yu, Zu-Guo; Anh, Vo; Ma, Yuan-Lin

    2018-05-01

    In recent years, researchers have proposed several methods to transform time series (such as those of fractional Brownian motion) into complex networks. In this paper, we construct horizontal visibility networks (HVNs) based on the α-stable Lévy motion. We aim to study the dependence of the multifractal and Laplacian spectra of the transformed networks on the parameters α and β of the α-stable Lévy motion. First, we employ the sandbox algorithm to compute the mass exponents and multifractal spectrum to investigate the multifractality of these HVNs. Then we perform least squares fits to find possible relations of the average fractal dimension, the average information dimension and the average correlation dimension against α using several methods of model selection. We also investigate possible dependence relations on α of the eigenvalues and energy calculated from the Laplacian and normalized Laplacian operators of the constructed HVNs. All of these constructions and estimates will help us to evaluate the validity and usefulness of the mappings between time series and networks, especially between time series of α-stable Lévy motions and HVNs.
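    The horizontal visibility criterion itself is simple: two time points i < j are linked iff every intermediate value lies strictly below the smaller of the two. A minimal construction:

```python
# Minimal horizontal visibility graph (HVG) construction: nodes are time
# points; i and j are linked iff all intermediate values are strictly below
# min(x[i], x[j]). Adjacent points are always linked (empty intermediate set).
def hvg_edges(x):
    edges = []
    for i in range(len(x)):
        for j in range(i + 1, len(x)):
            if all(x[k] < min(x[i], x[j]) for k in range(i + 1, j)):
                edges.append((i, j))
    return edges

print(hvg_edges([3, 1, 2, 4]))   # → [(0, 1), (0, 2), (0, 3), (1, 2), (2, 3)]
```

The O(n²) double loop above is the plain definition; for the long Lévy-motion series studied in the paper, stack-based O(n log n) constructions are typically used instead.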

  6. 'I still believe...' Reconstructing spirituality, culture and mental health across cultural divides.

    PubMed

    Mayer, Claude-Hélène; Viviers, Rian

    2014-06-01

    Whilst striving to create a balanced and healthy life, individuals experience challenges across their life span. Spirituality can contribute to mental health and well-being, as can cultural constructs. In South Africa, apartheid categories are still vivid; they affect spiritual, cultural and racial mental constructs and impact the mental health of individuals across cultural groups. This article focuses on the long-term development of spiritual and cultural concepts within a selected individual in Cape Town, South Africa, during 11 years of field work. It also explores the impact of spirituality and culture on the researcher-researched relationship. A mixed-method approach was used, including various qualitative methods of data collection, content analysis to analyse the data, and intersubjective validation to interpret it. Findings show a strong intrapersonal interlinkage of spirituality, culture and mental health, and that the researcher-researched relationship has a strong impact on spiritual, cultural and mental health constructions. 'We are not human beings having a spiritual experience. We are spiritual beings having a human experience.' (Pierre Teilhard de Chardin, 1976).

  7. Human niche construction in interdisciplinary focus

    PubMed Central

    Kendal, Jeremy; Tehrani, Jamshid J.; Odling-Smee, John

    2011-01-01

    Niche construction is an endogenous causal process in evolution, reciprocal to the causal process of natural selection. It works by adding ecological inheritance, comprising the inheritance of natural selection pressures previously modified by niche construction, to genetic inheritance in evolution. Human niche construction modifies selection pressures in environments in ways that affect both human evolution, and the evolution of other species. Human ecological inheritance is exceptionally potent because it includes the social transmission and inheritance of cultural knowledge, and material culture. Human genetic inheritance in combination with human cultural inheritance thus provides a basis for gene–culture coevolution, and multivariate dynamics in cultural evolution. Niche construction theory potentially integrates the biological and social aspects of the human sciences. We elaborate on these processes, and provide brief introductions to each of the papers published in this theme issue. PMID:21320894

  8. Construction of CRISPR Libraries for Functional Screening.

    PubMed

    Carstens, Carsten P; Felts, Katherine A; Johns, Sarah E

    2018-01-01

    Identification of gene function has been aided by the ability to generate targeted gene knockouts or transcriptional repression using the CRISPR/Cas9 system. Using pooled libraries of guide RNA expression vectors that direct Cas9 to a specific genomic site allows identification of genes that are either enriched or depleted in response to a selection scheme, thus linking the affected gene to the chosen phenotype. The quality of the data generated by the screening depends on the quality of the guide RNA delivery library with regard to error rates and, especially, the evenness of the distribution of the guides. Here, we describe a method for constructing complex plasmid libraries based on pooled designed oligomers with high representation and tight distributions. The procedure allows construction of plasmid libraries of >60,000 members with a 95th/5th percentile ratio of less than 3.5.
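    The 95th/5th percentile ratio quoted above is easy to compute from per-guide read counts; a small NumPy sketch with made-up count vectors:

```python
import numpy as np

def evenness_ratio(counts):
    """95th/5th percentile ratio of per-guide read counts, a standard
    uniformity metric for pooled libraries (1.0 = perfectly even)."""
    p95, p5 = np.percentile(np.asarray(counts, dtype=float), [95, 5])
    return p95 / p5

even = np.full(1000, 250.0)                           # uniform library
skewed = np.concatenate([np.full(200, 10.0),          # underrepresented guides
                         np.full(800, 250.0)])
print(evenness_ratio(even))     # 1.0
print(evenness_ratio(skewed))   # 25.0
```

    A low ratio (such as the <3.5 reported here) indicates that few guides are strongly over- or under-represented relative to the bulk of the library.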

  9. [Investigation on pattern and methods of quality control for Chinese materia medica based on dao-di herbs and bioassay - bioassay for Coptis chinensis].

    PubMed

    Yan, Dan; Xiao, Xiao-he

    2011-05-01

    The establishment of bioassay methods is a key technical issue in the bioassay of Chinese materia medica. Taking the bioassay of Coptis chinensis Franch. as an example, the establishment and application of bioassay methods (including bio-potency and bio-activity fingerprints) are explained from the aspects of methodology, principles of selection, experimental design, method confirmation and data analysis. Common technologies were extracted from these aspects so as to provide technical support for constructing a pattern and method of quality control for Chinese materia medica based on dao-di herbs and bioassay.

  10. Efficient robust doubly adaptive regularized regression with applications.

    PubMed

    Karunamuni, Rohana J; Kong, Linglong; Tu, Wei

    2018-01-01

    We consider the problem of estimation and variable selection for general linear regression models. Regularized regression procedures have been widely used for variable selection, but most existing methods perform poorly in the presence of outliers. We construct a new penalized procedure that simultaneously attains full efficiency and maximum robustness. Furthermore, the proposed procedure satisfies the oracle properties. The new procedure is designed to achieve sparse and robust solutions by imposing adaptive weights on both the decision loss and the penalty function. The proposed method of estimation and variable selection attains full efficiency when the model is correct and, at the same time, achieves maximum robustness when outliers are present. We examine the robustness properties using the finite-sample breakdown point and an influence function. We show that the proposed estimator attains the maximum breakdown point. Furthermore, there is no loss in efficiency when there are no outliers or the error distribution is normal. For practical implementation of the proposed method, we present a computational algorithm. We examine the finite-sample and robustness properties using Monte Carlo studies. Two datasets are also analyzed.
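    The double-weighting idea can be illustrated, though not with the authors' exact estimator, by a generic adaptive-lasso-style sketch: a robust pilot fit (Huber regression here, an assumed stand-in) supplies data-driven penalty weights, implemented by rescaling columns before a standard L1 fit. All tuning values and the synthetic data are illustrative.

```python
import numpy as np
from sklearn.linear_model import HuberRegressor, Lasso

def adaptive_robust_lasso(X, y, alpha=0.05, eps=1e-4):
    """Adaptive-lasso-style estimator with a robust pilot fit: penalty
    weight w_j = 1/(|beta_pilot_j| + eps), applied by rescaling column j
    by 1/w_j before an ordinary L1 fit, then undoing the rescaling."""
    pilot = HuberRegressor().fit(X, y)            # robust initial estimate
    scale = np.abs(pilot.coef_) + eps             # large pilot coef -> light penalty
    lasso = Lasso(alpha=alpha).fit(X * scale, y)
    return lasso.coef_ * scale                    # coefficients on original scale

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 8))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.1 * rng.normal(size=200)
y[:5] += 30.0                                     # a few gross outliers
beta = adaptive_robust_lasso(X, y)
```

    The rescaling trick is a standard way to obtain adaptive (coefficient-dependent) penalties from an off-the-shelf lasso solver; the robust pilot keeps the weights from being distorted by outliers.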

  11. A rapid, generally applicable method to engineer zinc fingers illustrated by targeting the HIV-1 promoter.

    PubMed

    Isalan, M; Klug, A; Choo, Y

    2001-07-01

    DNA-binding domains with predetermined sequence specificity are engineered by selection of zinc finger modules using phage display, allowing the construction of customized transcription factors. Despite remarkable progress in this field, the available protein-engineering methods are deficient in many respects, thus hampering the applicability of the technique. Here we present a rapid and convenient method that can be used to design zinc finger proteins against a variety of DNA-binding sites. This is based on a pair of pre-made zinc finger phage-display libraries, which are used in parallel to select two DNA-binding domains each of which recognizes given 5 base pair sequences, and whose products are recombined to produce a single protein that recognizes a composite (9 base pair) site of predefined sequence. Engineering using this system can be completed in less than two weeks and yields proteins that bind sequence-specifically to DNA with Kd values in the nanomolar range. To illustrate the technique, we have selected seven different proteins to bind various regions of the human immunodeficiency virus 1 (HIV-1) promoter.

  12. Nest construction by a ground-nesting bird represents a potential trade-off between egg crypticity and thermoregulation.

    PubMed

    Mayer, Paul M; Smith, Levica M; Ford, Robert G; Watterson, Dustin C; McCutchen, Marshall D; Ryan, Mark R

    2009-04-01

    Predation selects against conspicuous colors in bird eggs and nests, while thermoregulatory constraints select for nest-building behavior that regulates incubation temperatures. We present results that suggest a trade-off between nest crypticity and thermoregulation of eggs based on selection of nest materials by piping plovers (Charadrius melodus), a ground-nesting bird that constructs simple, pebble-lined nests highly vulnerable to predators and exposed to temperature extremes. Piping plovers selected pebbles that were whiter and appeared closer in color to eggs than randomly available pebbles, suggesting a crypsis function. However, nests that were more contrasting in color to surrounding substrates were at greater risk of predation, suggesting an alternate strategy driving selection of white rocks. Near-infrared reflectance of nest pebbles was higher than randomly available pebbles, indicating a direct physical mechanism for heat control through pebble selection. Artificial nests constructed of randomly available pebbles heated more quickly and conferred heat to model eggs, causing eggs to heat more rapidly than in nests constructed from piping plover nest pebbles. Thermal models and field data indicated that temperatures inside nests may remain up to 2-6 degrees C cooler than surrounding substrates. Thermal models indicated that nests heat especially rapidly if not incubated, suggesting that nest construction behavior may serve to keep eggs cooler during the unattended laying period. Thus, pebble selection suggests a potential trade-off between maximizing heat reflectance to improve egg microclimate and minimizing conspicuous contrast of nests with the surrounding substrate to conceal eggs from predators. Nest construction behavior that employs light-colored, thermally reflective materials may represent an evolutionary response by birds and other egg-laying organisms to egg predation and heat stress.

  13. Materials and methods for efficient lactic acid production

    DOEpatents

    Zhou, Shengde; Ingram, Lonnie O'Neal; Shanmugam, Keelnatham T; Yomano, Lorraine; Grabar, Tammy B; Moore, Jonathan C

    2013-04-23

    The present invention provides derivatives of Escherichia coli constructed for the production of lactic acid. The transformed E. coli of the invention are prepared by deleting the genes that encode competing pathways followed by a growth-based selection for mutants with improved performance. These transformed E. coli are useful for providing an increased supply of lactic acid for use in food and industrial applications.

  14. Materials and methods for efficient lactic acid production

    DOEpatents

    Zhou, Shengde [Sycamore, IL; Ingram, Lonnie O'Neal [Gainesville, FL; Shanmugam, Keelnatham T [Gainesville, FL; Yomano, Lorraine [Gainesville, FL; Grabar, Tammy B [Gainesville, FL; Moore, Jonathan C [Gainesville, FL

    2009-12-08

    The present invention provides derivatives of ethanologenic Escherichia coli K011 constructed for the production of lactic acid. The transformed E. coli of the invention are prepared by deleting the genes that encode competing pathways followed by a growth-based selection for mutants with improved performance. These transformed E. coli are useful for providing an increased supply of lactic acid for use in food and industrial applications.

  15. Decision Support Model for Mosque Renovation and Rehabilitation (Case Study: Ten Mosques in Jakarta Barat, Indonesia)

    NASA Astrophysics Data System (ADS)

    Utama, D. N.; Triana, Y. S.; Iqbal, M. M.; Iksal, M.; Fikri, I.; Dharmawan, T.

    2018-03-01

    For Muslims, a mosque is not only a place for daily worship but also a center of culture; it is an important and valuable building that deserves good management. For a responsible department or institution (such as the Religion or Planning Department in Indonesia), managing a large number of mosques is not a simple task, given the volume of data and the variety of problems involved. In particular, when renovating and rehabilitating damaged mosques, deciding which damaged mosque should be given priority for renovation and rehabilitation is problematic. Using two optimization methods, simulated annealing and hill climbing, a decision support model for mosque renovation and rehabilitation was systematically constructed. Fuzzy logic was also applied to establish the priority of eleven selected parameters. The constructed model is able to simulate an efficiency comparison between the two optimization methods and to suggest the most objective decision from 196 generated alternatives.
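    The two search strategies can be sketched on a toy prioritization objective. The urgency scores, the swap neighborhood over priority orderings, and the linear cooling schedule below are illustrative assumptions, not the paper's model.

```python
import math
import random

def priority_cost(order, urgency):
    """Penalty for scheduling urgent mosques late: position index x urgency."""
    return sum(pos * urgency[m] for pos, m in enumerate(order))

def hill_climb(order, urgency, iters=5000, seed=0):
    rng, best = random.Random(seed), list(order)
    for _ in range(iters):
        i, j = rng.sample(range(len(best)), 2)
        cand = list(best)
        cand[i], cand[j] = cand[j], cand[i]        # swap two priorities
        if priority_cost(cand, urgency) < priority_cost(best, urgency):
            best = cand                            # accept improvements only
    return best

def simulated_annealing(order, urgency, iters=5000, t0=5.0, seed=0):
    rng, cur = random.Random(seed), list(order)
    best = list(cur)
    for k in range(iters):
        t = t0 * (1 - k / iters) + 1e-9            # linear cooling
        i, j = rng.sample(range(len(cur)), 2)
        cand = list(cur)
        cand[i], cand[j] = cand[j], cand[i]
        d = priority_cost(cand, urgency) - priority_cost(cur, urgency)
        if d < 0 or rng.random() < math.exp(-d / t):
            cur = cand                             # sometimes accept worse moves
        if priority_cost(cur, urgency) < priority_cost(best, urgency):
            best = list(cur)
    return best

urgency = [5, 1, 9, 3, 7, 2, 8, 4, 6, 0]           # hypothetical damage scores
start = list(range(10))
hc = hill_climb(start, urgency)
sa = simulated_annealing(start, urgency)
```

    Hill climbing only ever accepts improvements, so it can stall in local optima on rugged objectives; simulated annealing occasionally accepts worse moves early on, which is the efficiency trade-off such a model can compare.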

  16. [Assessment on the ecological suitability in Zhuhai City, Guangdong, China, based on minimum cumulative resistance model].

    PubMed

    Li, Jian-fei; Li, Lin; Guo, Luo; Du, Shi-hong

    2016-01-01

    Urban landscape has the characteristic of spatial heterogeneity. Because the expansion processes of urban constructive land and ecological land encounter different resistance values, each land unit promotes or hinders the expansion of ecological land with different intensity. To compare the promoting and hindering functions of the same land unit, we first compared the minimum cumulative resistance values of the two functions, and then looked for the balance between the two landscape processes under the same standard. Following the ecological principle of the minimum limiting factor, and taking the minimum cumulative resistance analysis of the two expansion processes as the evaluation method for urban land ecological suitability, this study took Zhuhai City as the study area and estimated urban ecological suitability by a relative evaluation method using remote sensing images, field surveys and statistical data. With the support of ArcGIS, five types of indicators, covering landscape types, ecological value, soil erosion sensitivity, sensitivity to geological disasters, and ecological function, were selected as input parameters for the minimum cumulative resistance model to compute urban ecological suitability. The results showed that the ecological suitability of the whole of Zhuhai City fell into five levels: constructive expansion prohibited zone (10.1%), constructive expansion restricted zone (32.9%), key construction zone (36.3%), priority development zone (2.3%), and basic cropland (18.4%). The ecological suitability of the central area of Zhuhai City fell into four levels: constructive expansion prohibited zone (11.6%), constructive expansion restricted zone (25.6%), key construction zone (52.4%), and priority development zone (10.4%). Finally, we put forward a sustainable development framework for Zhuhai City based on these conclusions. On one hand, the government should strictly control the development of the urban center area; on the other hand, secondary urban centers such as Junchang and Doumen need to improve their public infrastructure to relieve the imbalance between eastern and western development in Zhuhai City.
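    Computationally, a minimum cumulative resistance surface is a multi-source shortest-path problem over the resistance raster. A sketch using Dijkstra's algorithm, under the simplifying assumption that entering a cell costs its resistance value (4-neighborhood, unit cell size):

```python
import heapq

def min_cumulative_resistance(resistance, sources):
    """Minimum cumulative resistance surface over a raster: the cost of
    the cheapest path from any source cell, where entering a cell costs
    its resistance value (source cells cost 0). 4-neighborhood Dijkstra."""
    rows, cols = len(resistance), len(resistance[0])
    INF = float("inf")
    mcr = [[INF] * cols for _ in range(rows)]
    heap = []
    for r, c in sources:
        mcr[r][c] = 0.0
        heapq.heappush(heap, (0.0, r, c))
    while heap:
        d, r, c = heapq.heappop(heap)
        if d > mcr[r][c]:
            continue                      # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + resistance[nr][nc]
                if nd < mcr[nr][nc]:
                    mcr[nr][nc] = nd
                    heapq.heappush(heap, (nd, nr, nc))
    return mcr
```

    Running the computation twice, once with ecological sources and once with constructive sources, and differencing the two surfaces gives the kind of promote-versus-hinder comparison described above.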

  17. RESIDENTIAL RADON RESISTANT CONSTRUCTION FEATURE SELECTION SYSTEM

    EPA Science Inventory

    The report describes a proposed residential radon resistant construction feature selection system. The features consist of engineered barriers to reduce radon entry and accumulation indoors. The proposed Florida standards require radon resistant features in proportion to regional...

  18. A Java-based tool for the design of classification microarrays.

    PubMed

    Meng, Da; Broschat, Shira L; Call, Douglas R

    2008-08-04

    Classification microarrays are used for purposes such as identifying strains of bacteria and determining genetic relationships to understand the epidemiology of an infectious disease. For these cases, mixed microarrays, which are composed of DNA from more than one organism, are more effective than conventional microarrays composed of DNA from a single organism. Selection of probes is a key factor in designing successful mixed microarrays because redundant sequences are inefficient and limited representation of diversity can restrict application of the microarray. We have developed a Java-based software tool, called PLASMID, for use in selecting the minimum set of probe sequences needed to classify different groups of plasmids or bacteria. The software program was successfully applied to several different sets of data. The utility of PLASMID was illustrated using existing mixed-plasmid microarray data as well as data from a virtual mixed-genome microarray constructed from different strains of Streptococcus. Moreover, use of data from expression microarray experiments demonstrated the generality of PLASMID. In this paper we describe a new software tool for selecting a set of probes for a classification microarray. While the tool was developed for the design of mixed microarrays (and mixed-plasmid microarrays in particular), it can also be used to design expression arrays. The user can choose from several clustering methods (including hierarchical, non-hierarchical, and a model-based genetic algorithm), several probe ranking methods, and several different display methods. A novel approach is used for probe redundancy reduction, and probe selection is accomplished via stepwise discriminant analysis. Data can be entered in different formats (including Excel and comma-delimited text), and dendrogram, heat map, and scatter plot images can be saved in several different formats (including jpeg and tiff). 
Weights generated using stepwise discriminant analysis can be stored for analysis of subsequent experimental data. Additionally, PLASMID can be used to construct virtual microarrays with genomes from public databases, which can then be used to identify an optimal set of probes.
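    The redundancy-reduction step can be approximated generically (PLASMID's own algorithm is described as novel and is not reproduced here): cluster probe signal profiles by correlation distance and keep one representative per cluster. A SciPy sketch with synthetic profiles:

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import pdist

def reduce_redundant_probes(profiles, threshold=1.0):
    """Collapse probes with near-identical signal profiles: average-linkage
    clustering on correlation distance, keeping the lowest-index probe of
    each cluster as its representative."""
    Z = linkage(pdist(profiles, metric="correlation"), method="average")
    labels = fcluster(Z, t=threshold, criterion="distance")
    return sorted(int(np.where(labels == lab)[0][0]) for lab in np.unique(labels))

rng = np.random.default_rng(0)
base = np.array([1.0, 0.0, 1.0, 0.0, 1.0, 0.0])
# two groups of three nearly identical (i.e. redundant) probe profiles
profiles = np.vstack([base + rng.normal(0, 0.01, 6) for _ in range(3)] +
                     [1.0 - base + rng.normal(0, 0.01, 6) for _ in range(3)])
reps = reduce_redundant_probes(profiles)   # one probe per redundancy group
```

    The surviving representatives could then be ranked and passed to a discriminant-analysis style selection step, as the abstract describes.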

  19. The Construct Validity of HPAT-Ireland for the Selection of Medical Students: Unresolved Issues and Future Research Implications

    ERIC Educational Resources Information Center

    Kelly, Maureen E.; O'Flynn, Siun

    2017-01-01

    Aptitude tests are widely used in selection. However, despite certain advantages their use remains controversial. This paper aims to critically appraise five sources of evidence for the construct validity of the Health Professions Admission Test (HPAT)-Ireland, an aptitude test used for selecting undergraduate medical students. The objectives are…

  20. Isolation and characterization of novel microsatellite markers from the sika deer (Cervus nippon) genome.

    PubMed

    Li, Y M; Bai, C Y; Niu, W P; Yu, H; Yang, R J; Yan, S Q; Zhang, J Y; Zhang, M J; Zhao, Z H

    2015-09-28

    Microsatellite markers are widely and evenly distributed and highly polymorphic. Because rapid and convenient detection is possible through automated analysis, microsatellite markers are widely used in the construction of plant and animal genetic maps, quantitative trait loci localization, marker-assisted selection, identification of genetic relationships, and studies of genetic diversity and phylogeny. However, few microsatellite markers have been isolated for the sika deer. We used streptavidin magnetic beads to affinity-capture and construct a (CA)n microsatellite DNA-enriched library from sika deer. The positive clone rate of the enriched library reached 82.9%, and positive clones were selected for sequencing. There were 395 sequences containing CA repeats, with repeat numbers ranging from 4 to 105. From the sequences containing more than six repeats, 297 primer pairs were designed. Following PCR amplification, clear bands amplified with non-specific primers were selected, yielding 245 screened primer pairs; 50 of these pairs were then randomly selected to screen for polymorphisms in a group of 65 unrelated sika deer, in which we detected 47 polymorphic and 3 monomorphic loci. These newly isolated and characterized microsatellite loci can be used to construct genetic maps and for lineage testing in deer, as well as for comparative genomics between Cervidae species.

  1. A highly oriented hybrid microarray modified electrode fabricated by a template-free method for ultrasensitive electrochemical DNA recognition

    NASA Astrophysics Data System (ADS)

    Shi, Lei; Chu, Zhenyu; Dong, Xueliang; Jin, Wanqin; Dempsey, Eithne

    2013-10-01

    Highly oriented growth of a hybrid microarray was realized by a facile template-free method on gold substrates for the first time. The proposed formation mechanism involves an interfacial structure-directing force arising from self-assembled monolayers (SAMs) between gold substrates and hybrid crystals. Different SAMs and variable surface coverage of the assembled molecules play a critical role in the interfacial directing forces and influence the morphologies of hybrid films. A highly oriented hybrid microarray was formed on the highly aligned and vertical SAMs of 1,4-benzenedithiol molecules with rigid backbones, which afforded an intense structure-directing power for the oriented growth of hybrid crystals. Additionally, the density of the microarray could be adjusted by controlling the surface coverage of assembled molecules. Based on the hybrid microarray modified electrode with a large specific area (ca. 10 times its geometrical area), a label-free electrochemical DNA biosensor was constructed for the detection of an oligonucleotide fragment of the avian flu virus H5N1. The DNA biosensor displayed a significantly low detection limit of 5 pM (S/N = 3), a wide linear response from 10 pM to 10 nM, as well as excellent selectivity, good regeneration and high stability. We expect that the proposed template-free method can provide a new reference for the fabrication of a highly oriented hybrid array and the as-prepared microarray modified electrode will be a promising paradigm in constructing highly sensitive and selective biosensors. Electronic supplementary information (ESI) available. See DOI: 10.1039/c3nr03097k

  2. Participatory Training to Improve Safety and Health in Small Construction Sites in Some Countries in Asia: Development and Application of the WISCON Training Program.

    PubMed

    Kawakami, Tsuyoshi

    2016-08-01

    A participatory training program, Work Improvement in Small Construction Sites, was developed to provide practical support measures to the small construction sector. Managers and workers from selected small sites were interviewed about their occupational safety and health risks. The Work Improvement in Small Construction Sites training program comprised a 45-item action checklist, photos, and illustrations showing local examples and group work methods. Pilot training workshops were carried out with workers and employers in Cambodia, Laos, Mongolia, Thailand, and Vietnam. Participants subsequently planned, and using locally available low-cost materials, implemented their own improvements such as hand-made hand trucks to carry heavy materials, removal of projecting nails from timber materials, and fences to protect roof workers from falling. Local Work Improvement in Small Construction Sites trainers consisting of government officials, workers, employers, and nongovernment organization representatives were then trained to implement the Work Improvement in Small Construction Sites training widely. Keys to success were easy-to-apply training tools aiming at immediate, low-cost improvements, and collaboration with various local people's networks. © The Author(s) 2016.

  3. A Multifeatures Fusion and Discrete Firefly Optimization Method for Prediction of Protein Tyrosine Sulfation Residues.

    PubMed

    Guo, Song; Liu, Chunhua; Zhou, Peng; Li, Yanling

    2016-01-01

    Tyrosine sulfation is one of the ubiquitous protein posttranslational modifications, in which sulfate groups are added to tyrosine residues. It plays significant roles in various physiological processes in eukaryotic cells. To explore the molecular mechanism of tyrosine sulfation, one of the prerequisites is to correctly identify possible protein tyrosine sulfation residues. In this paper, a novel method is presented to predict protein tyrosine sulfation residues from primary sequences. By means of informative feature construction and an elaborate feature selection and parameter optimization scheme, the proposed predictor achieved promising results and outperformed many other state-of-the-art predictors. Using the optimal feature subset, the proposed method achieved a mean MCC of 94.41% on the benchmark dataset, and an MCC of 90.09% on the independent dataset. The experimental performance indicates that the proposed method can be effective in identifying important protein posttranslational modifications and that the feature selection scheme could be powerful in research on protein functional residue prediction.

  4. A Multifeatures Fusion and Discrete Firefly Optimization Method for Prediction of Protein Tyrosine Sulfation Residues

    PubMed Central

    Liu, Chunhua; Zhou, Peng; Li, Yanling

    2016-01-01

    Tyrosine sulfation is one of the ubiquitous protein posttranslational modifications, in which sulfate groups are added to tyrosine residues. It plays significant roles in various physiological processes in eukaryotic cells. To explore the molecular mechanism of tyrosine sulfation, one of the prerequisites is to correctly identify possible protein tyrosine sulfation residues. In this paper, a novel method is presented to predict protein tyrosine sulfation residues from primary sequences. By means of informative feature construction and an elaborate feature selection and parameter optimization scheme, the proposed predictor achieved promising results and outperformed many other state-of-the-art predictors. Using the optimal feature subset, the proposed method achieved a mean MCC of 94.41% on the benchmark dataset, and an MCC of 90.09% on the independent dataset. The experimental performance indicates that the proposed method can be effective in identifying important protein posttranslational modifications and that the feature selection scheme could be powerful in research on protein functional residue prediction. PMID:27034949

  5. Determination of propranolol hydrochloride in pharmaceutical preparations using near infrared spectrometry with fiber optic probe and multivariate calibration methods.

    PubMed

    Marques Junior, Jucelino Medeiros; Muller, Aline Lima Hermes; Foletto, Edson Luiz; da Costa, Adilson Ben; Bizzi, Cezar Augusto; Irineu Muller, Edson

    2015-01-01

    A method for the determination of propranolol hydrochloride in pharmaceutical preparations using near-infrared spectrometry with a fiber optic probe (FTNIR/PROBE) combined with chemometric methods was developed. Calibration models were developed using two variable selection methods: interval partial least squares (iPLS) and synergy interval partial least squares (siPLS). Treatments based on mean-centered data and multiplicative scatter correction (MSC) were selected for model construction. A root mean square error of prediction (RMSEP) of 8.2 mg g(-1) was achieved using the siPLS (s2i20PLS) algorithm, with the spectra divided into 20 intervals and a combination of 2 intervals (8501 to 8801 and 5201 to 5501 cm(-1)). Results obtained by the proposed method were compared with those of the pharmacopoeia reference method, and no significant difference was observed. Therefore, the proposed method allows a fast, precise, and accurate determination of propranolol hydrochloride in pharmaceutical preparations. Furthermore, it is possible to carry out on-line analysis of this active principle in pharmaceutical formulations with the use of a fiber optic probe.

  6. Correlates of fruit and vegetable consumption among construction laborers and motor freight workers

    PubMed Central

    Nagler, Eve M.; Viswanath, K.; Ebbeling, Cara B.; Stoddard, Anne M.; Sorensen, Glorian C.

    2013-01-01

    Purpose To compare and contrast correlates of fruit and vegetable consumption in two blue-collar populations: construction laborers and motor freight workers. Methods Cross-sectional data were collected from two groups of male workers: (1) construction laborers (N=1013; response rate = 44%) randomly selected from a national sample, as part of a diet and smoking cessation study; and (2) motor freight workers (N=542; response rate = 78%) employed in eight trucking terminals, as part of a tobacco cessation and weight management study. Data were analyzed using linear regression modeling methods. Results For both groups, higher income and believing it was important to eat right because of work were positively associated with fruit and vegetable consumption; conversely, being White was associated with lower intake. Construction laborers who reported eating junk food due to workplace stress and fatigue had lower fruit and vegetable intake. For motor freight workers, perceiving fast food to be the only choice at work and lack of time to eat right were associated with lower consumption. Conclusion Comparing occupational groups illustrates how work experiences may be related to fruit and vegetable consumption in different ways as well as facilitates the development of interventions that can be used across groups. PMID:22729935

  7. Direct phase selection of initial phases from single-wavelength anomalous dispersion (SAD) for the improvement of electron density and ab initio structure determination.

    PubMed

    Chen, Chung-De; Huang, Yen-Chieh; Chiang, Hsin-Lin; Hsieh, Yin-Cheng; Guan, Hong-Hsiang; Chuankhayan, Phimonphan; Chen, Chun-Jung

    2014-09-01

    Optimization of the initial phasing has been a decisive factor in the success of the subsequent electron-density modification, model building and structure determination of biological macromolecules using the single-wavelength anomalous dispersion (SAD) method. Two possible phase solutions (φ1 and φ2) generated from two symmetric phase triangles in the Harker construction for the SAD method cause the well known phase ambiguity. A novel direct phase-selection method utilizing the θ(DS) list as a criterion to select optimized phases φ(am) from φ1 or φ2 of a subset of reflections with a high percentage of correct phases to replace the corresponding initial SAD phases φ(SAD) has been developed. Based on this work, reflections with an angle θ(DS) in the range 35-145° are selected for an optimized improvement, where θ(DS) is the angle between the initial phase φ(SAD) and a preliminary density-modification (DM) phase φ(DM)(NHL). The results show that utilizing the additional direct phase-selection step prior to simple solvent flattening without phase combination using existing DM programs, such as RESOLVE or DM from CCP4, significantly improves the final phases in terms of increased correlation coefficients of electron-density maps and diminished mean phase errors. With the improved phases and density maps from the direct phase-selection method, the completeness of residues of protein molecules built with main chains and side chains is enhanced for efficient structure determination.
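    The selection criterion can be illustrated in a deliberately simplified toy form (real implementations work with figure-of-merit-weighted phase probabilities; the closer-to-φ(DM) tie-break below is an assumption): keep reflections whose φ(SAD)-to-φ(DM) angle θ(DS) falls in 35-145° and, for those, take whichever of φ1/φ2 lies nearer the DM phase.

```python
import numpy as np

def ang_diff(a, b):
    """Smallest absolute angular difference between phases, in degrees."""
    return np.abs((a - b + 180.0) % 360.0 - 180.0)

def select_phases(phi_sad, phi_dm, phi1, phi2, lo=35.0, hi=145.0):
    """Toy version of the direct phase-selection criterion: for reflections
    with theta_DS = angle(phi_SAD, phi_DM) in [lo, hi], replace phi_SAD by
    whichever of the two SAD solutions phi1/phi2 is closer to phi_DM."""
    theta_ds = ang_diff(phi_sad, phi_dm)
    pick = np.where(ang_diff(phi1, phi_dm) <= ang_diff(phi2, phi_dm),
                    phi1, phi2)
    selected = (theta_ds >= lo) & (theta_ds <= hi)
    return np.where(selected, pick, phi_sad), selected

phi_sad = np.array([10.0, 100.0])
phi_dm = np.array([20.0, 350.0])
phi1 = np.array([30.0, 340.0])
phi2 = np.array([170.0, 200.0])
phi_new, selected = select_phases(phi_sad, phi_dm, phi1, phi2)
```

    Reflections with small θ(DS) already agree with the DM phase and are left untouched; the window targets the subset where swapping in φ1 or φ2 is most likely to raise the fraction of correct phases.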

  8. THE NICHE CONSTRUCTION PERSPECTIVE: A CRITICAL APPRAISAL*

    PubMed Central

    Scott-Phillips, Thomas C; Laland, Kevin N; Shuker, David M; Dickins, Thomas E; West, Stuart A

    2014-01-01

    Niche construction refers to the activities of organisms that bring about changes in their environments, many of which are evolutionarily and ecologically consequential. Advocates of niche construction theory (NCT) believe that standard evolutionary theory fails to recognize the full importance of niche construction, and consequently propose a novel view of evolution, in which niche construction and its legacy over time (ecological inheritance) are described as evolutionary processes, equivalent in importance to natural selection. Here, we subject NCT to critical evaluation, in the form of a collaboration between one prominent advocate of NCT, and a team of skeptics. We discuss whether niche construction is an evolutionary process, whether NCT obscures or clarifies how natural selection leads to organismal adaptation, and whether niche construction and natural selection are of equivalent explanatory importance. We also consider whether the literature that promotes NCT overstates the significance of niche construction, whether it is internally coherent, and whether it accurately portrays standard evolutionary theory. Our disagreements reflect a wider dispute within evolutionary theory over whether the neo-Darwinian synthesis is in need of reformulation, as well as different usages of some key terms (e.g., evolutionary process). PMID:24325256

  9. The construction of bilingual teaching of optoelectronic technology

    NASA Astrophysics Data System (ADS)

    Zhang, Yang; Zhao, Enming; Yang, Fan; Li, Qingbo; Zhu, Zheng; Li, Cheng; Sun, Peng

    2017-08-01

    This paper combines the characteristics of optoelectronic technology with those of bilingual teaching. The course pays attention to integrating theory with practice and to cultivating learners' abilities. Reform and exploration have been carried out in the areas of teaching materials, teaching content, teaching methods, etc. The concrete content comprises five parts: selecting teaching materials, establishing the teaching syllabus, choosing suitable teaching methods, making multimedia courseware and improving the test system. These measures aim to arouse students' interest in study and foster their autonomous learning ability, providing useful references for improving the quality of optoelectronics bilingual courses.

  10. Implementation of MCA Method for Identification of Factors for Conceptual Cost Estimation of Residential Buildings

    NASA Astrophysics Data System (ADS)

    Juszczyk, Michał; Leśniak, Agnieszka; Zima, Krzysztof

    2013-06-01

    Conceptual cost estimation is important for construction projects: either underestimating or overestimating the cost of erecting a building may lead to the failure of a project. In this paper the authors present an application of multicriteria comparative analysis (MCA) to select the factors influencing residential building construction cost. The aim of the analysis is to indicate key factors useful for conceptual cost estimation in the early design stage. Key factors are investigated on the basis of elementary information about the function, form and structure of the building and the primary assumptions about the technological and organizational solutions applied in the construction process. These factors serve as the variables of a model intended to make conceptual cost estimation fast and satisfactorily accurate. The analysis comprised three steps: preliminary research, the choice of a set of potential variables, and the reduction of this set to a final set of variables. Multicriteria comparative analysis is applied to solve the problem. The analysis identified a group of factors, defined well enough at the conceptual stage of the design process, to be used as the descriptive variables of the model.

  11. Construction and performance characteristics of new ion selective electrodes based on carbon nanotubes for determination of meclofenoxate hydrochloride.

    PubMed

    El-Nashar, Rasha M; Abdel Ghani, Nour T; Hassan, Sherif M

    2012-06-12

    This work describes the construction and comparative evaluation of the performance characteristics of conventional polymer (I), carbon paste (II) and carbon nanotube chemically modified carbon paste (III) ion selective electrodes for meclofenoxate hydrochloride. These electrodes depend mainly on the incorporation of the ion pair of meclofenoxate hydrochloride with phosphomolybdic acid (PMA) or phosphotungstic acid (PTA). They showed near-Nernstian responses over a usable concentration range of 1.0 × 10^-5 to 1.0 × 10^-2 M, with slopes in the range 55.15-59.74 mV (concentration decade)^-1. The developed electrodes were fully characterized in terms of their composition, response time, working concentration range, life span, and usable pH and temperature ranges. The electrodes showed very good selectivity for Meclo with respect to a large number of inorganic cations and sugars, and in the presence of the degradation product of the drug (p-chlorophenoxyacetic acid). The standard additions method was applied to the determination of MecloCl in pure solution, pharmaceutical preparations and biological samples. Dissolution testing was also performed using the proposed sensors. Copyright © 2011 Elsevier B.V. All rights reserved.
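    For context, the reported slopes can be compared with the theoretical Nernst slope, 2.303RT/zF, which is about 59.16 mV per concentration decade for a monovalent cation at 25 °C. A minimal sketch (the helper name is ours, not from the paper):

```python
import math

def nernst_slope_mV(temp_C, z=1):
    """Theoretical Nernst slope 2.303*R*T/(z*F), in mV per
    concentration decade, for an ion of charge z."""
    R = 8.314462618      # gas constant, J/(mol*K)
    F = 96485.33212      # Faraday constant, C/mol
    T = temp_C + 273.15  # absolute temperature, K
    return math.log(10) * R * T / (z * F) * 1000.0

slope_25C = nernst_slope_mV(25.0)
print(round(slope_25C, 2))  # -> 59.16 mV/decade for z = 1
# the reported electrode slopes (55.15-59.74 mV/decade) bracket this value,
# hence "near-Nernstian"
```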

  12. Raman Spectrometer for Surface Identification of Minerals and Organic Compounds on Silicate Planets and Small Solar-System Bodies

    NASA Technical Reports Server (NTRS)

    Haskin, Larry A.

    2000-01-01

    This summary is the final report of work on a two-year grant. Our objectives for this project were (1) to demonstrate that Raman spectroscopy is an excellent method for determining mineralogy on the surface of the Moon, Mars, and other planetary bodies; (2) to construct a prototype of a small Raman spectrometer of the kind we suggest could be used on a lander or rover; and (3) to test the ability of that spectrometer to identify minerals and quantify mineral proportions in lunar materials and complex Martian analog materials, and to identify organic matter in planetary surface materials, all under roughly simulated field conditions. These goals have been met. The principal accomplishments of this PIDDP project were the following: selection for flight; construction of a breadboard Raman probe; confirmation of the throughput of the breadboard Raman probe; selection of a laser; a breadboard spectrograph based on our PIDDP design; and the overall result.

  13. [Peptide phage display in biotechnology and biomedicine].

    PubMed

    Kuzmicheva, G A; Belyavskaya, V A

    2016-07-01

    To date, peptide phage display is one of the most common combinatorial methods used for identifying specific peptide ligands. Phage display peptide libraries containing billions of different clones have been successfully used for the selection of ligands with high affinity and selectivity toward a wide range of targets, including individual proteins, bacteria, viruses, spores, different kinds of cancer cells and a variety of inorganic targets (metals, alloys, semiconductors, etc.). The success of using filamentous phage in phage display technologies relies on the robustness of phage particles and the possibility of genetically modifying their DNA to construct new phage variants with novel properties. In this review we discuss the characteristics of the best-known non-commercial peptide phage display libraries of different formats (landscape libraries in particular) and their successful applications in several fields of biotechnology and biomedicine: the discovery of peptides with diagnostic value against different pathogens, the discovery and use of peptides recognizing cancer cells, trends in the use of phage display technologies in human interactome studies, and the application of phage display technologies in the construction of novel nanomaterials.

  14. Development of a statistical oil spill model for risk assessment.

    PubMed

    Guo, Weijun

    2017-11-01

    To gain a better understanding of the impacts of potential risk sources, we developed an oil spill model using a probabilistic method that simulates numerous oil spill trajectories under varying environmental conditions. Statistical results were quantified from hypothetical oil spills under multiple scenarios, including the probability of an area being affected, the mean oil slick thickness, and the duration of water surface exposure to floating oil. These three sub-indices, together with marine area vulnerability, are merged to compute a composite index characterizing the spatial distribution of risk. The integral of the index can be used to quantify the overall risk from an emission source. The developed model has been successfully applied to the comparison and selection of an appropriate oil port construction location adjacent to a marine protected area for Phoca largha in China. The results highlight the importance of comparing candidate locations before construction, since risk estimates for two adjacent potential sources may differ significantly depending on hydrodynamic conditions and eco-environmental sensitivity. Copyright © 2017. Published by Elsevier Ltd.
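    The merging of sub-indices into a composite index can be sketched as below. This is an illustrative aggregation only: the equal weights, the [0, 1] normalization of each sub-index, and the plain summation standing in for spatial integration are our assumptions, not the paper's calibration.

```python
def composite_risk(p_affect, mean_thickness, exposure_dur, vulnerability,
                   weights=(0.25, 0.25, 0.25, 0.25)):
    """Merge normalized sub-indices (each scaled to [0, 1]) into a single
    composite risk score for one grid cell."""
    parts = (p_affect, mean_thickness, exposure_dur, vulnerability)
    return sum(w * x for w, x in zip(weights, parts))

def overall_risk(cells):
    """'Integral' of the index over a source's affected cells: here a
    plain sum over cells stands in for spatial integration."""
    return sum(composite_risk(*c) for c in cells)

# hypothetical cells for two candidate port locations:
# (affected probability, slick thickness, exposure duration, vulnerability)
site_a = [(0.8, 0.6, 0.7, 0.9), (0.4, 0.3, 0.5, 0.9)]  # near protected area
site_b = [(0.5, 0.2, 0.3, 0.1), (0.6, 0.4, 0.4, 0.1)]  # low-vulnerability area
print(overall_risk(site_a) > overall_risk(site_b))  # -> True
```

    Comparing the two overall-risk integrals is what ranks the candidate construction locations.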

  15. Models of Cultural Niche Construction with Selection and Assortative Mating

    PubMed Central

    Feldman, Marcus W.

    2012-01-01

    Niche construction is a process through which organisms modify their environment and, as a result, alter the selection pressures on themselves and other species. In cultural niche construction, one or more cultural traits can influence the evolution of other cultural or biological traits by affecting the social environment in which the latter traits may evolve. Cultural niche construction may include either gene-culture or culture-culture interactions. Here we develop a model of this process and suggest some applications of this model. We examine the interactions between cultural transmission, selection, and assorting, paying particular attention to the complexities that arise when selection and assorting are both present, in which case stable polymorphisms of all cultural phenotypes are possible. We compare our model to a recent model for the joint evolution of religion and fertility and discuss other potential applications of cultural niche construction theory, including the evolution and maintenance of large-scale human conflict and the relationship between sex ratio bias and marriage customs. The evolutionary framework we introduce begins to address complexities that arise in the quantitative analysis of multiple interacting cultural traits. PMID:22905167

  16. Adhesion switch on a gecko-foot inspired smart nanocupule surface

    NASA Astrophysics Data System (ADS)

    Song, Wenlong

    2014-10-01

    A gecko-foot inspired nanocupule surface, prepared by an AAO template covering method, was composed of a poly(N-isopropylacrylamide)/polystyrene (PNIPAm/PS) blend. The film exhibited both superhydrophobicity and high adhesion force at room temperature. Moreover, by controlling the temperature, the wettability of the film could be switched between 138.1 +/- 5.5° and 150.6 +/- 1.5°, and the adhesion force could be tuned accordingly. This reversibility in both wettability and adhesion force could be used to construct smart devices for the fine selection of water droplets. The proof of concept was demonstrated by the selective catching of water droplets of precisely controlled weight at different temperatures. This work could help in the design of new types of devices for blood bioanalysis or lossless drug transportation. Electronic supplementary information (ESI) available. See DOI: 10.1039/c4nr04090b

  17. Improved design of hammerhead ribozyme for selective digestion of target RNA through recognition of site-specific adenosine-to-inosine RNA editing

    PubMed Central

    Fukuda, Masatora; Kurihara, Kei; Yamaguchi, Shota; Oyama, Yui; Deshimaru, Masanobu

    2014-01-01

    Adenosine-to-inosine (A-to-I) RNA editing is an endogenous regulatory mechanism involved in various biological processes. Site-specific, editing-state–dependent degradation of target RNA may be a powerful tool both for analyzing the mechanism of RNA editing and for regulating biological processes. Previously, we designed an artificial hammerhead ribozyme (HHR) for selective, site-specific RNA cleavage dependent on the A-to-I RNA editing state. In the present work, we developed an improved strategy for constructing a trans-acting HHR that specifically cleaves target editing sites in the adenosine but not the inosine state. Specificity for unedited sites was achieved by utilizing a sequence encoding the intrinsic cleavage specificity of a natural HHR. We used in vitro selection methods in an HHR library to select for an extended HHR containing a tertiary stabilization motif that facilitates HHR folding into an active conformation. By using this method, we successfully constructed highly active HHRs with unedited-specific cleavage. Moreover, using HHR cleavage followed by direct sequencing, we demonstrated that this ribozyme could cleave serotonin 2C receptor (HTR2C) mRNA extracted from mouse brain, depending on the site-specific editing state. This unedited-specific cleavage also enabled us to analyze the effect of editing state at the E and C sites on editing at other sites by using direct sequencing for the simultaneous quantification of the editing ratio at multiple sites. Our approach has the potential to elucidate the mechanism underlying the interdependencies of different editing states in substrate RNA with multiple editing sites. PMID:24448449

  18. Study on Big Database Construction and its Application of Sample Data Collected in CHINA'S First National Geographic Conditions Census Based on Remote Sensing Images

    NASA Astrophysics Data System (ADS)

    Cheng, T.; Zhou, X.; Jia, Y.; Yang, G.; Bai, J.

    2018-04-01

    In the project of China's First National Geographic Conditions Census, millions of sample data records were collected all over the country for interpreting land cover from remote sensing images; the number of data files exceeds 12,000,000 and has grown further in the follow-on project of National Geographic Conditions Monitoring. At present, a database such as Oracle is the most effective way to store such big data, but a suitable method for managing and applying the sample data is even more significant. This paper studies a database construction method based on a relational database combined with a distributed file system, in which the vector data and the file data are stored in different physical locations. The key issues and their solutions are discussed. On this basis, application methods for the sample data are studied and several kinds of use cases analyzed, laying the foundation for the data's application. In particular, sample data located in Shaanxi province were selected to verify the method. At the same time, taking the 10 first-level classes defined in the land cover classification system as examples, the spatial distribution and density characteristics of all kinds of sample data are analyzed. The results verify that the database construction method based on a relational database with a distributed file system is useful and applicable for searching, analyzing and applying the sample data. Furthermore, the sample data collected in the project of China's First National Geographic Conditions Census could be useful in Earth observation and in assessing the quality of land cover products.
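    The storage split described above (attribute and vector records in the relational database, bulky files on a distributed file system referenced only by path) can be sketched with an in-memory SQLite table; the schema, field names and `dfs://` path scheme here are hypothetical illustrations, not the project's actual design.

```python
import sqlite3

con = sqlite3.connect(":memory:")
# attribute record lives in the relational database ...
con.execute("""CREATE TABLE sample_data (
                   id         INTEGER PRIMARY KEY,
                   class_l1   TEXT,   -- first-level land cover class
                   province   TEXT,
                   lon        REAL,
                   lat        REAL,
                   photo_path TEXT    -- pointer into the distributed FS
               )""")
# ... while the field photo itself sits on the distributed file system
con.execute("INSERT INTO sample_data VALUES (1, 'cultivated land', "
            "'Shaanxi', 108.95, 34.27, 'dfs://samples/0001.jpg')")
rows = con.execute("SELECT class_l1, photo_path FROM sample_data "
                   "WHERE province = 'Shaanxi'").fetchall()
print(rows)  # -> [('cultivated land', 'dfs://samples/0001.jpg')]
```

    Queries stay fast because only lightweight metadata is indexed in the database, while the file system scales out to hold the millions of photo and document files.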

  19. Two-Way Gene Interaction From Microarray Data Based on Correlation Methods.

    PubMed

    Alavi Majd, Hamid; Talebi, Atefeh; Gilany, Kambiz; Khayyer, Nasibeh

    2016-06-01

    Gene networks have generated a massive explosion in the development of high-throughput techniques for monitoring various aspects of gene activity. Networks offer a natural way to model interactions between genes, and extracting gene network information from high-throughput genomic data is an important and difficult task. The purpose of this study is to construct a two-way gene network based on parametric and nonparametric correlation coefficients. The first step in constructing a gene co-expression network is to score all pairs of gene vectors; the second step is to select a score threshold and connect all gene pairs whose scores exceed this value. In this study, we constructed two-way gene networks using nonparametric methods, such as Spearman's rank correlation coefficient and Blomqvist's measure, and compared them with Pearson's correlation coefficient. We surveyed six genes associated with venous thrombosis, made each matrix entry represent the score for the corresponding gene pair, and obtained two-way interactions using Pearson's correlation, Spearman's rank correlation, and Blomqvist's coefficient. Finally, these methods were compared with visual methods: Cytoscape, based on BIND, and Gene Ontology, based on molecular function. R software version 3.2 and Bioconductor were used to perform the analyses. The results based on the Pearson and Spearman correlations agreed and were confirmed by the Cytoscape and GO visual methods; Blomqvist's coefficient, however, was not confirmed by the visual methods. Some correlation results did not match the visualizations, possibly because of the small amount of data.
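    The two-step construction (score all gene pairs, then connect pairs above a threshold) can be sketched in pure Python. The gene names, expression vectors and the 0.8 threshold are hypothetical, and the rank helper ignores ties for brevity.

```python
from itertools import combinations

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length vectors."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x)
           * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def rank(v):
    """1-based ranks of a vector (no tie handling, for brevity)."""
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0.0] * len(v)
    for pos, i in enumerate(order):
        r[i] = pos + 1.0
    return r

def spearman(x, y):
    """Spearman's rho: Pearson correlation of the ranks."""
    return pearson(rank(x), rank(y))

def coexpression_edges(expr, score=pearson, threshold=0.8):
    """Step 1: score all gene pairs.  Step 2: connect pairs whose
    absolute score exceeds the threshold."""
    return [(g1, g2) for g1, g2 in combinations(expr, 2)
            if abs(score(expr[g1], expr[g2])) >= threshold]

# hypothetical expression vectors for three genes
expr = {"F5":    [1.0, 2.1, 3.0, 4.2],
        "F2":    [0.9, 2.0, 3.1, 4.0],   # tracks F5 closely
        "MTHFR": [4.0, 1.0, 3.5, 0.5]}   # unrelated profile
print(coexpression_edges(expr))  # -> [('F5', 'F2')]
```

    Swapping `score=spearman` into `coexpression_edges` gives the nonparametric variant of the same network.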

  20. A nonparametric significance test for sampled networks.

    PubMed

    Elliott, Andrew; Leicht, Elizabeth; Whitmore, Alan; Reinert, Gesine; Reed-Tsochas, Felix

    2018-01-01

    Our work is motivated by an interest in constructing a protein-protein interaction network that captures key features associated with Parkinson's disease. While there is an abundance of subnetwork construction methods available, it is often far from obvious which subnetwork is the most suitable starting point for further investigation. We provide a method to assess whether a subnetwork constructed from a seed list (a list of nodes known to be important in the area of interest) differs significantly from a randomly generated subnetwork. The proposed method uses a Monte Carlo approach. As different seed lists can give rise to the same subnetwork, we control for redundancy by constructing a minimal seed list as the starting point for the significance test. The null model is based on random seed lists of the same length as a minimum seed list that generates the subnetwork; in this random seed list the nodes have (approximately) the same degree distribution as the nodes in the minimum seed list. We use this null model to select subnetworks which deviate significantly from random on an appropriate set of statistics and might capture useful information for a real-world protein-protein interaction network. The software used in this paper is available for download at https://sites.google.com/site/elliottande/. The software is written in Python and uses the NetworkX library. ande.elliott@gmail.com or felix.reed-tsochas@sbs.ox.ac.uk. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press.
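    The null model can be sketched on a toy graph. This is a schematic reimplementation under simplifying assumptions (a single edge-count statistic, exact rather than approximate degree matching, node labels sampled with replacement), not the authors' Python/NetworkX code.

```python
import random

def subnetwork_stat(graph, seeds):
    """Toy statistic: number of edges among the seed nodes (the actual
    test uses a set of subnetwork statistics)."""
    s = set(seeds)
    return sum(1 for u, v in graph["edges"] if u in s and v in s)

def degree_matched_seeds(graph, template, rng):
    """Random seed list whose nodes match the degree sequence of the
    template (minimal seed) list."""
    by_degree = {}
    for node, deg in graph["degree"].items():
        by_degree.setdefault(deg, []).append(node)
    return [rng.choice(by_degree[graph["degree"][t]]) for t in template]

def mc_pvalue(graph, seeds, n_draws=1000, seed=0):
    """Fraction of degree-matched random seed lists whose statistic is
    at least as large as the observed one."""
    rng = random.Random(seed)
    observed = subnetwork_stat(graph, seeds)
    hits = sum(subnetwork_stat(graph,
                               degree_matched_seeds(graph, seeds, rng))
               >= observed
               for _ in range(n_draws))
    return hits / n_draws

# toy graph: a triangle a-b-c plus two sparse paths d-e-f and g-h-i
edges = [("a", "b"), ("a", "c"), ("b", "c"),
         ("d", "e"), ("e", "f"), ("g", "h"), ("h", "i")]
degree = {}
for u, v in edges:
    degree[u] = degree.get(u, 0) + 1
    degree[v] = degree.get(v, 0) + 1
graph = {"edges": edges, "degree": degree}

# the triangle is unusually dense for its degrees, so p should be small
p = mc_pvalue(graph, ["a", "b", "c"])
print(p)
```

    The seed list {a, b, c} induces all 3 possible edges, while most degree-matched random seed lists induce fewer, so the estimated p-value is small and the triangle would be flagged as a non-random subnetwork.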

  1. A nonparametric significance test for sampled networks

    PubMed Central

    Leicht, Elizabeth; Whitmore, Alan; Reinert, Gesine; Reed-Tsochas, Felix

    2018-01-01

    Motivation: Our work is motivated by an interest in constructing a protein–protein interaction network that captures key features associated with Parkinson’s disease. While there is an abundance of subnetwork construction methods available, it is often far from obvious which subnetwork is the most suitable starting point for further investigation. Results: We provide a method to assess whether a subnetwork constructed from a seed list (a list of nodes known to be important in the area of interest) differs significantly from a randomly generated subnetwork. The proposed method uses a Monte Carlo approach. As different seed lists can give rise to the same subnetwork, we control for redundancy by constructing a minimal seed list as the starting point for the significance test. The null model is based on random seed lists of the same length as a minimum seed list that generates the subnetwork; in this random seed list the nodes have (approximately) the same degree distribution as the nodes in the minimum seed list. We use this null model to select subnetworks which deviate significantly from random on an appropriate set of statistics and might capture useful information for a real-world protein–protein interaction network. Availability and implementation: The software used in this paper is available for download at https://sites.google.com/site/elliottande/. The software is written in Python and uses the NetworkX library. Contact: ande.elliott@gmail.com or felix.reed-tsochas@sbs.ox.ac.uk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:29036452

  2. Sub Surface Geoelectrical Imaging for Potential Geohazard in Infrastructure Construction in Sidoarjo, East Java

    NASA Astrophysics Data System (ADS)

    Sumintadireja, Prihadi; Irawan, Diky

    2017-06-01

    Mud volcano remnants have been identified in Surabaya and adjacent areas. According to historical reports, the people of East Java are accustomed to, and able to adjust to, the natural phenomena in their areas. The Sidoarjo mud volcano, which erupted coincident with drilling activity on 29 May 2006, has made the public and government anxious about developing new infrastructure such as high-rise buildings and toll roads. Understanding a geological hazard, which can be single, sequential or combined in origin, is the key to subsurface imaging. Geological hazards can be identified by geophysical, geological and geotechnical methods. The proper selection of a geophysical method to reveal subsurface conditions is as important a factor as survey design and field data acquisition. Subsurface conditions are essential information for a site investigation, which comprises geological, geophysical and geotechnical data; analysis of these data helps civil engineers design the construction and assess its safety.

  3. Continuum radiation from active galactic nuclei: A statistical study

    NASA Technical Reports Server (NTRS)

    Isobe, T.; Feigelson, E. D.; Singh, K. P.; Kembhavi, A.

    1986-01-01

    The physics of the continuum spectrum of active galactic nuclei (AGNs) was examined using a large data set and rigorous statistical methods. A database was constructed for 469 objects, including radio-selected quasars, optically selected quasars, X-ray selected AGNs, BL Lac objects, and optically unidentified compact radio sources. Each object has measurements of its radio, optical, and X-ray core continuum luminosity, though many of these are upper limits. Since many radio sources have extended components, the core components were carefully separated from the total radio luminosity. With survival analysis statistical methods, which can treat upper limits correctly, these data yield better statistical results than those previously obtained. A variety of statistical tests were performed, such as the comparison of the luminosity functions of different subsamples and linear regressions of luminosities in different bands. Interpretation of the results leads to the following tentative conclusions: the main emission mechanism of optically selected quasars and X-ray selected AGNs is thermal, while that of BL Lac objects is synchrotron; radio-selected quasars may have two different emission mechanisms in the X-ray band; BL Lac objects appear to be special cases of the radio-selected quasars; some compact radio sources show the possibility of synchrotron self-Compton (SSC) emission in the optical band; and the spectral index between the optical and the X-ray bands depends on the optical luminosity.

  4. Iterative algorithm-guided design of massive strain libraries, applied to itaconic acid production in yeast.

    PubMed

    Young, Eric M; Zhao, Zheng; Gielesen, Bianca E M; Wu, Liang; Benjamin Gordon, D; Roubos, Johannes A; Voigt, Christopher A

    2018-05-09

    Metabolic engineering requires multiple rounds of strain construction to evaluate alternative pathways and enzyme concentrations. Optimizing multigene pathways stepwise or by randomly selecting enzymes and expression levels is inefficient. Here, we apply methods from design of experiments (DOE) to guide the construction of strain libraries from which the maximum information can be extracted without sampling every possible combination. We use Saccharomyces cerevisiae as a host for a novel six-gene pathway to itaconic acid, selected by comparing alternative shunt pathways that bypass the mitochondrial TCA cycle. The pathway is distinctive for the use of acetylating acetaldehyde dehydrogenase to increase cytosolic acetyl-CoA pools, a bacterial enzyme to synthesize citrate in the cytosol, and an itaconic acid exporter. Precise control over the expression of each gene is enabled by a set of promoter-terminator pairs that span a 174-fold range. Two large combinatorial libraries (160 variants, 2.4 Mb and 32 variants, 0.6 Mb) are designed where the expression levels are selected by statistical methods (I-optimal response surface methodology, full factorial, or Plackett-Burman) with the intent of extracting different types of guiding information after the screen. This is applied to the design of a third library (24 variants, 0.5 Mb) intended to alleviate a bottleneck in cis-aconitate decarboxylase (CAD) expression. The top strain produces 815 mg/l itaconic acid, a 4-fold improvement over the initial strain achieved by iteratively balancing pathway expression. Including a methylated product in the total, the strain produces 1.3 g/l combined itaconic acids. Further, a regression analysis of the libraries reveals the optimal expression level of CAD as well as pairwise interdependencies between genes that result in increased titer and purity of itaconic acid. This work demonstrates adapting algorithmic design strategies to guide automated yeast strain construction and learn information after each iteration. Copyright © 2018. Published by Elsevier Inc.
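    The contrast between exhaustive and DOE-guided library design can be sketched as follows. The promoter labels are hypothetical, and the 8-run Plackett-Burman matrix shown is the standard cyclic construction for up to 7 two-level factors, not the specific designs used in the paper.

```python
from itertools import product

def full_factorial(levels_per_gene):
    """Every combination of expression levels: exhaustive but large."""
    return list(product(*levels_per_gene))

def plackett_burman_8():
    """8-run Plackett-Burman screen for up to 7 two-level factors,
    built from cyclic shifts of the standard N=8 generator row plus
    a final all-low run.  Columns are mutually orthogonal."""
    gen = [+1, +1, +1, -1, +1, -1, -1]
    rows = [gen[-i:] + gen[:-i] for i in range(7)]
    rows.append([-1] * 7)
    return rows

# six-gene pathway, two candidate promoter strengths per gene
ff = full_factorial([("low", "high")] * 6)
print(len(ff))  # -> 64 strains for a full factorial
pb = plackett_burman_8()
print(len(pb))  # -> 8 strains suffice to screen main effects
```

    The orthogonal columns of the Plackett-Burman matrix let each gene's main effect on titer be estimated from far fewer strains than the full factorial, which is the efficiency argument the paper makes for algorithm-guided library design.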

  5. A training image evaluation and selection method based on minimum data event distance for multiple-point geostatistics

    NASA Astrophysics Data System (ADS)

    Feng, Wenjie; Wu, Shenghe; Yin, Yanshu; Zhang, Jiajia; Zhang, Ke

    2017-07-01

    A training image (TI) can be regarded as a database of spatial structures and their low- to higher-order statistics used in multiple-point geostatistics (MPS) simulation. Presently, there are a number of methods to construct a series of candidate TIs (CTIs) for MPS simulation based on a modeler's subjective criteria. The spatial structures of TIs often vary considerably, meaning that the compatibility of different CTIs with the conditioning data differs. Therefore, evaluation and optimal selection of CTIs before MPS simulation is essential. This paper proposes a CTI evaluation and optimal selection method based on minimum data event distance (MDevD). In the proposed method, a set of MDevD properties is established through calculation of the MDevD of conditioning data events in each CTI. CTIs are then evaluated and ranked according to the mean value and variance of the MDevD properties. The smaller the mean value and variance of an MDevD property are, the more compatible the corresponding CTI is with the conditioning data. In addition, data events with low compatibility in the conditioning data grid can be located, helping modelers select a set of complementary CTIs for MPS simulation. The MDevD property can also help to narrow the range of the distance threshold for MPS simulation. The proposed method was evaluated using three examples: a 2D categorical example, a 2D continuous example, and an actual 3D oil reservoir case study. To illustrate the method, a C++ implementation is attached to the paper.
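    The evaluation step can be sketched as follows. This is a schematic of the ranking idea only: the data events are tiny hypothetical facies templates, the distance is a simple mismatch fraction, and a real MDevD computation scans data events over the full TI grid.

```python
from statistics import mean, pvariance

def event_distance(e1, e2):
    """Mismatch fraction between two same-shaped categorical data events."""
    return sum(a != b for a, b in zip(e1, e2)) / len(e1)

def min_devd(cond_event, ti_events):
    """MDevD of one conditioning data event: distance to its closest
    matching event anywhere in the training image."""
    return min(event_distance(cond_event, t) for t in ti_events)

def rank_ctis(cond_events, ctis):
    """Score each candidate TI by the mean and variance of its MDevD
    property; smaller values mean better compatibility with the
    conditioning data."""
    scores = {}
    for name, ti_events in ctis.items():
        devds = [min_devd(e, ti_events) for e in cond_events]
        scores[name] = (mean(devds), pvariance(devds))
    return sorted(scores, key=lambda n: scores[n])

# hypothetical 4-node data events (0 = shale, 1 = sand)
cond = [(0, 1, 1, 0), (1, 1, 0, 0)]
ctis = {"TI_channel": [(0, 1, 1, 0), (1, 1, 0, 0), (1, 1, 1, 0)],
        "TI_lobe":    [(0, 0, 0, 0), (1, 1, 1, 1)]}
print(rank_ctis(cond, ctis))  # -> ['TI_channel', 'TI_lobe']
```

    TI_channel reproduces both conditioning data events exactly (mean MDevD = 0), so it ranks ahead of TI_lobe, whose best matches always leave half the nodes mismatched.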

  6. Defining the Reference Condition for Wadeable Streams in the Sand Hills Subdivision of the Southeastern Plains Ecoregion, USA

    NASA Astrophysics Data System (ADS)

    Kosnicki, Ely; Sefick, Stephen A.; Paller, Michael H.; Jarrell, Miller S.; Prusha, Blair A.; Sterrett, Sean C.; Tuberville, Tracey D.; Feminella, Jack W.

    2014-09-01

    The Sand Hills subdivision of the Southeastern Plains ecoregion has been impacted by historical land uses over the past two centuries and, with the additive effects of contemporary land use, determining reference condition for streams in this region is a challenge. We identified reference condition based on the combined use of 3 independent selection methods. Method 1 involved use of a multivariate disturbance gradient derived from several stressors, method 2 was based on variation in channel morphology, and method 3 was based on passing 6 of 7 environmental criteria. Sites selected as reference from all 3 methods were considered primary reference, whereas those selected by 2 or 1 methods were considered secondary or tertiary reference, respectively. Sites not selected by any of the methods were considered non-reference. In addition, best professional judgment (BPJ) was used to exclude some sites from any reference class, and comparisons were made to examine the utility of BPJ. Non-metric multidimensional scaling indicated that use of BPJ may help designate non-reference sites when unidentified stressors are present. The macroinvertebrate community measures Ephemeroptera, Plecoptera, Trichoptera richness and North Carolina Biotic Index showed no differences between primary and secondary reference sites when BPJ was ignored. However, there was no significant difference among primary, secondary, and tertiary reference sites when BPJ was used. We underscore the importance of classifying reference conditions, especially in regions that have endured significant anthropogenic activity. We suggest that the use of secondary reference sites may enable construction of models that target a broader set of management interests.
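    The tiering logic (primary = selected by all 3 methods, secondary = 2, tertiary = 1, with best professional judgment able to veto any class) can be sketched as below; the function and argument names are ours.

```python
def classify_site(selected_by, use_bpj=False, bpj_excluded=False):
    """Tier a site by how many of the three independent selection
    methods chose it: 3 -> primary, 2 -> secondary, 1 -> tertiary,
    0 -> non-reference.  When best professional judgment (BPJ) is
    used, it can exclude a site from any reference class."""
    if use_bpj and bpj_excluded:
        return "non-reference"
    tiers = {3: "primary", 2: "secondary", 1: "tertiary", 0: "non-reference"}
    return tiers[sum(selected_by)]

# selected_by = (disturbance gradient, channel morphology, 6-of-7 criteria)
print(classify_site((True, True, True)))   # -> primary
print(classify_site((True, False, True)))  # -> secondary
print(classify_site((True, True, True),
                    use_bpj=True, bpj_excluded=True))  # -> non-reference
```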

  7. Methods comparison for microsatellite marker development: Different isolation methods, different yield efficiency

    NASA Astrophysics Data System (ADS)

    Zhan, Aibin; Bao, Zhenmin; Hu, Xiaoli; Lu, Wei; Hu, Jingjie

    2009-06-01

    Microsatellite markers have become one of the most important molecular tools used in various kinds of research. Large numbers of microsatellite markers are required for whole-genome surveys in molecular ecology, quantitative genetics and genomics. It is therefore necessary to select versatile, low-cost, efficient and time- and labor-saving methods for developing large panels of microsatellite markers. In this study, we used the Zhikong scallop (Chlamys farreri) as the target species to compare the efficiency of five methods derived from three strategies for microsatellite marker development. The results showed that the strategy of constructing a small-insert genomic DNA library had poor efficiency, while the microsatellite-enrichment strategy greatly improved isolation efficiency. Although the strategy of mining public databases is time- and cost-saving, it is difficult to obtain large numbers of microsatellite markers this way, mainly because of the limited sequence data of non-model species deposited in public databases. Based on these results, we recommend two methods for large-scale microsatellite marker development: the microsatellite-enriched library construction method and the FIASCO-colony hybridization method, both derived from the microsatellite-enrichment strategy. The experimental results from the Zhikong scallop also provide a reference for microsatellite marker development in other species with large genomes.

  8. A constructive nonlinear array (CNA) method for barely visible impact detection in composite materials

    NASA Astrophysics Data System (ADS)

    Malfense Fierro, Gian Piero; Meo, Michele

    2017-04-01

    Numerous phased array techniques, such as Full Matrix Capture (FMC) and the Total Focusing Method (TFM), currently provide good damage assessment for composite materials. However, linear methods struggle to evaluate and assess low levels of damage, while nonlinear methods have shown great promise in early damage detection. A sweep-and-subtraction evaluation method coupled with a constructive nonlinear array (CNA) method is proposed in order to assess damage-specific nonlinearities, address issues with frequency selection when using nonlinear ultrasound imaging techniques, and reduce equipment-generated nonlinearities. These methods were evaluated using multiple excitation locations on an impacted composite panel with complex damage (barely visible impact damage). According to various recent works, damage excitation can be accentuated by exciting at local defect resonance (LDR) frequencies, although these frequencies are not always easily determined. The sweep methodology uses broadband excitation to determine both local defect and material resonances; by assessing defect-generated nonlinearities with a laser vibrometer, it is possible to determine which frequencies excite the complex geometry of the crack. Together, the accurate determination of local defect resonances, the use of an image subtraction method, and the reduction of equipment-based nonlinearities through CNA result in greater repeatability and clearer nonlinear imaging (NIM).

  9. Metrics in method engineering

    NASA Astrophysics Data System (ADS)

    Brinkkemper, S.; Rossi, M.

    1994-12-01

    As customizable computer-aided software engineering (CASE) tools, or CASE shells, have been introduced in academia and industry, there has been growing interest in the systematic construction of methods and their support environments, i.e., method engineering. To aid method developers and selectors in their tasks, we propose two sets of metrics, which measure the complexity of diagrammatic specification techniques on the one hand and of complete systems development methods on the other. The proposed metrics provide a relatively fast and simple way to analyze a technique's (or method's) properties and, when accompanied by other selection criteria, can be used to estimate the cost of learning a technique and its relative complexity compared with others. To demonstrate the applicability of the proposed metrics, we have applied them to 34 techniques and 15 methods.

  10. Body Awareness: Construct and Self-Report Measures

    PubMed Central

    Mehling, Wolf E.; Gopisetty, Viranjini; Daubenmier, Jennifer; Price, Cynthia J.; Hecht, Frederick M.; Stewart, Anita

    2009-01-01

    Objectives Heightened body awareness can be adaptive and maladaptive. Improving body awareness has been suggested as an approach for treating patients with conditions such as chronic pain, obesity and post-traumatic stress disorder. We assessed the psychometric quality of selected self-report measures and examined their items for underlying definitions of the construct. Data sources PubMed, PsycINFO, HaPI, Embase, Digital Dissertations Database. Review methods Abstracts were screened; potentially relevant instruments were obtained and systematically reviewed. Instruments were excluded if they exclusively measured anxiety, covered emotions without related physical sensations, used observer ratings only, or were unobtainable. We restricted our study to the proprioceptive and interoceptive channels of body awareness. The psychometric properties of each scale were rated using a structured evaluation according to the method of McDowell. Following a working definition of the multi-dimensional construct, an inter-disciplinary team systematically examined the items of existing body awareness instruments, identified the dimensions queried and used an iterative qualitative process to refine the dimensions of the construct. Results From 1,825 abstracts, 39 instruments were screened. Twelve were included for psychometric evaluation. Only two were rated as high standard for reliability, four for validity. Four domains of body awareness with 11 sub-domains emerged. Neither a single instrument nor a compilation of several instruments covered all dimensions. Key domains that might potentially differentiate adaptive and maladaptive aspects of body awareness were missing in the reviewed instruments. Conclusion Existing self-report instruments do not address important domains of the construct of body awareness, are unable to discern between adaptive and maladaptive aspects of body awareness, or exhibit other psychometric limitations.
Restricting the construct to its proprio- and interoceptive channels, we explore the current understanding of the multi-dimensional construct and suggest next steps for further research. PMID:19440300

  11. Rock Slope Stability Evaluation in a Steep-Walled Canyon: Application to Elevator Construction in the Yunlong River Valley, Enshi, China

    NASA Astrophysics Data System (ADS)

    Xiao, Lili; Chai, Bo; Yin, Kunlong

    2015-09-01

    A passenger elevator is to be built on a nearly vertical slope in the National Geological Park in Enshi, Hubei province, China. The construction comprises three steps: excavating the slope toe for the elevator platform, building the elevator on the platform, and affixing the elevator to the slope using anchors. To evaluate the rock slope stability in the elevator area and the safety of the elevator construction, we applied three techniques, qualitative analysis, formula calculation, and numerical simulation, based on field investigation and parameter selection and considering both wet and dry conditions before and after construction. Stability factors for sliding and falling were calculated using the limit equilibrium method; the results show that the slope as a whole is stable, with a few unstable blocks, notably block BT1. Formula-based stability factors were calculated for four sections on block BT1, revealing the following: anchors will decrease the stability of certain rock pieces; the lowest average stability factor after anchoring will be Kf = 1.36 in wet conditions; block BT1 should be reinforced during elevator construction, up to a first-class slope stability factor of Kf = 1.40; and the slope as a whole is stable. Numerical simulation using FLAC3D indicated that the stress distribution will reach equilibrium for all steps before and after construction, and that the factor of safety (FOS) is within the general slope safety range (FOS > 1.05). We suggest that unstable pieces in block BT1 be reinforced during construction to a first-class slope safety range (FOS > 1.3), and that deformation monitoring on the slope surface be implemented.
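The limit-equilibrium calculation for planar sliding can be sketched as a ratio of resisting to driving forces. All parameters below are invented stand-ins, not the study's values for block BT1; the wet case is modeled simply as an uplift force on the slip plane:

```python
# Minimal limit-equilibrium sketch for planar sliding of a rock block,
# illustrating the kind of stability-factor calculation the study performs.
# Geometry and strength values are hypothetical.
import math

def fos_planar_sliding(weight, dip_deg, cohesion, area, friction_deg, uplift=0.0):
    """Factor of safety = resisting / driving forces on the slide plane.
    weight in kN, cohesion in kPa, area in m^2, angles in degrees."""
    dip, phi = math.radians(dip_deg), math.radians(friction_deg)
    normal = weight * math.cos(dip) - uplift          # effective normal force
    resisting = cohesion * area + normal * math.tan(phi)
    driving = weight * math.sin(dip)
    return resisting / driving

# Dry vs. wet comparison for an invented block:
dry = fos_planar_sliding(weight=5000, dip_deg=35, cohesion=20, area=40, friction_deg=30)
wet = fos_planar_sliding(weight=5000, dip_deg=35, cohesion=20, area=40, friction_deg=30,
                         uplift=800)
print(round(dry, 2), round(wet, 2))
```

As in the study, water pressure lowers the stability factor; anchoring would enter the same balance as an added resisting force.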

  12. Ordinal feature selection for iris and palmprint recognition.

    PubMed

    Sun, Zhenan; Wang, Libin; Tan, Tieniu

    2014-09-01

    Ordinal measures have been demonstrated to be an effective feature representation model for iris and palmprint recognition. However, ordinal measures are a general concept in image analysis, and numerous variants with different parameter settings, such as location, scale, and orientation, can be derived to construct a huge feature space. This paper proposes a novel optimization formulation for ordinal feature selection with successful applications to both iris and palmprint recognition. The objective function of the proposed feature selection method has two parts: the misclassification error of intra- and interclass matching samples and the weighted sparsity of the ordinal feature descriptors. The feature selection therefore aims to achieve an accurate and sparse representation of ordinal measures. The optimization is subject to a number of linear inequality constraints, which require that all intra- and interclass matching pairs be well separated with a large margin. Ordinal feature selection is formulated as a linear programming (LP) problem so that a solution can be obtained efficiently even on a large-scale feature pool and training database. Extensive experimental results demonstrate that the proposed LP formulation is advantageous over existing feature selection methods, such as mRMR, ReliefF, Boosting, and Lasso, for biometric recognition, reporting state-of-the-art accuracy on the CASIA and PolyU databases.
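A toy analogue of such an LP formulation, under the assumption that feature weights are non-negative and the margin is fixed at 1: minimize the l1 norm of the weights subject to every inter-class dissimilarity exceeding every intra-class dissimilarity. The data are random stand-ins, not biometric features:

```python
# Sketch of LP-based sparse feature selection with margin constraints.
# Not the paper's exact formulation; feature scores are simulated.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n_features = 20
# Per-feature dissimilarity scores for matching pairs (rows = pairs).
intra = rng.normal(0.2, 0.05, size=(30, n_features))  # same-identity pairs
inter = rng.normal(0.8, 0.05, size=(30, n_features))  # different-identity pairs

# Constraints: for every (intra i, inter j) pair, w . (inter_j - intra_i) >= 1.
diffs = (inter[:, None, :] - intra[None, :, :]).reshape(-1, n_features)
A_ub = -diffs            # linprog enforces A_ub @ w <= b_ub
b_ub = -np.ones(len(diffs))
c = np.ones(n_features)  # minimize the l1 norm (w >= 0 via bounds)

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * n_features)
w = res.x
print(res.success, int((w > 1e-6).sum()), "features carry nonzero weight")
```

The LP vertex solution is naturally sparse: only a few features receive nonzero weight while all margin constraints hold.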

  13. Recovering faces from memory: the distracting influence of external facial features.

    PubMed

    Frowd, Charlie D; Skelton, Faye; Atherton, Chris; Pitchford, Melanie; Hepton, Gemma; Holden, Laura; McIntyre, Alex H; Hancock, Peter J B

    2012-06-01

    Recognition memory for unfamiliar faces is facilitated when contextual cues (e.g., head pose, background environment, hair and clothing) are consistent between study and test. By contrast, inconsistencies in external features, especially hair, promote errors in unfamiliar face-matching tasks. For the construction of facial composites, as carried out by witnesses and victims of crime, the role of external features (hair, ears, and neck) is less clear, although research does suggest their involvement. Here, over three experiments, we investigate the impact of external features for recovering facial memories using a modern, recognition-based composite system, EvoFIT. Participant-constructors inspected an unfamiliar target face and, one day later, repeatedly selected items from arrays of whole faces, with "breeding," to "evolve" a composite with EvoFIT; further participants (evaluators) named the resulting composites. In Experiment 1, the important internal-features (eyes, brows, nose, and mouth) were constructed more identifiably when the visual presence of external features was decreased by Gaussian blur during construction: higher blur yielded more identifiable internal-features. In Experiment 2, increasing the visible extent of external features (to match the target's) in the presented face-arrays also improved internal-features quality, although less so than when external features were masked throughout construction. Experiment 3 demonstrated that masking external-features promoted substantially more identifiable images than using the previous method of blurring external-features. Overall, the research indicates that external features are a distracting rather than a beneficial cue for face construction; the results also provide a much better method to construct composites, one that should dramatically increase identification of offenders.

  14. Discriminative spatial-frequency-temporal feature extraction and classification of motor imagery EEG: A sparse regression and Weighted Naïve Bayesian Classifier-based approach.

    PubMed

    Miao, Minmin; Zeng, Hong; Wang, Aimin; Zhao, Changsen; Liu, Feixiang

    2017-02-15

    Common spatial pattern (CSP) is the most widely used method in motor imagery-based brain-computer interface (BCI) systems. In the conventional CSP algorithm, pairs of the eigenvectors corresponding to both extreme eigenvalues are selected to construct the optimal spatial filter. In addition, an appropriate selection of subject-specific time segments and frequency bands plays an important role in its successful application. This study proposes to optimize spatial-frequency-temporal patterns for discriminative feature extraction. Spatial optimization is implemented by channel selection and by finding discriminative spatial filters adaptively on each time-frequency segment. A novel Discernibility of Feature Sets (DFS) criterion is designed for spatial filter optimization. In addition, discriminative features located in multiple time-frequency segments are selected automatically by the proposed sparse time-frequency segment common spatial pattern (STFSCSP) method, which exploits sparse regression for selecting significant features. Finally, a weight determined by the sparse coefficient is assigned to each selected CSP feature, and we propose a Weighted Naïve Bayesian Classifier (WNBC) for classification. Experimental results on two public EEG datasets demonstrate that optimizing spatial-frequency-temporal patterns in a data-driven manner for discriminative feature extraction greatly improves the classification performance. The proposed method gives significantly better classification accuracies in comparison with several competing methods in the literature. The proposed approach is a promising candidate for future BCI systems. Copyright © 2016 Elsevier B.V. All rights reserved.
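The CSP step the paper builds on, keeping eigenvector pairs from both extremes of a generalized eigenspectrum, can be sketched as follows. The "trials" are synthetic noise, used only to show the mechanics:

```python
# Bare-bones common spatial pattern (CSP) computation. Filters come from the
# generalized eigenproblem C1 w = lambda (C1 + C2) w; data are synthetic.
import numpy as np
from scipy.linalg import eigh

def csp_filters(trials_a, trials_b, n_pairs=2):
    """trials_*: arrays of shape (n_trials, n_channels, n_samples)."""
    def mean_cov(trials):
        covs = [x @ x.T / np.trace(x @ x.T) for x in trials]
        return np.mean(covs, axis=0)
    c1, c2 = mean_cov(trials_a), mean_cov(trials_b)
    vals, vecs = eigh(c1, c1 + c2)          # eigenvalues in ascending order
    # Keep pairs of eigenvectors from both extremes of the spectrum.
    idx = np.concatenate([np.arange(n_pairs),
                          np.arange(len(vals) - n_pairs, len(vals))])
    return vecs[:, idx].T                   # (2 * n_pairs, n_channels)

rng = np.random.default_rng(1)
a = rng.normal(size=(20, 8, 128))           # class-A "EEG trials"
b = rng.normal(size=(20, 8, 128))           # class-B "EEG trials"
W = csp_filters(a, b)
print(W.shape)  # (4, 8)
```

In the paper's pipeline, a computation like this would run per time-frequency segment, with the DFS criterion scoring each resulting filter set.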

  15. Relationship between Frequency and Intensity of Cigarette Smoking and TTFC/C among Students of the GYTS in Select Countries, 2007-2009

    ERIC Educational Resources Information Center

    Lam, Eugene; Giovino, Gary A.; Shin, Mikyong; Lee, Kyung A.; Rolle, Italia; Asma, Samira

    2014-01-01

    Background: This study assessed the construct validity of a measure of nicotine dependence that was used in the Global Youth Tobacco Survey (GYTS). Methods: Using 2007-2009 data from the GYTS, subjects from 6 countries were used to assess current smokers' odds of reporting time to first cigarette or craving positive (TTFC/C+) by the number of…

  16. Chapter 3: Selecting materials for mine soil construction when establishing forests on Appalachian mined lands

    Treesearch

    Jeff Skousen; Carl Zipper; Jim Burger; Christopher Barton; Patrick. Angel

    2017-01-01

    The Forestry Reclamation Approach (FRA), a method for reclaiming coal-mined land to forest (Chapter 2, this volume), is based on research, knowledge, and experience of forest soil scientists and reclamation practitioners. Step 1 of the FRA is to create a suitable rooting medium for good tree growth that is no less than 4 feet deep and consists of topsoil, weathered...

  17. Fire Safety Aspects of Polymeric Materials. Volume 6. Aircraft. Civil and Military

    DTIC Science & Technology

    1977-01-01

    resistance of the existing polyurethane foam-based seating systems be improved through design, construction, and selection of covering materials. 12. A...aircraft interiors under real fire conditions. To provide the data base for developing improved fire safety standards for aircraft, four types of...the determination of immobilizing effect was based on performance in the swimming test, a simple exercise method also favored by Kimmerle to provide

  18. A flexible computational framework for detecting, characterizing, and interpreting statistical patterns of epistasis in genetic studies of human disease susceptibility.

    PubMed

    Moore, Jason H; Gilbert, Joshua C; Tsai, Chia-Ti; Chiang, Fu-Tien; Holden, Todd; Barney, Nate; White, Bill C

    2006-07-21

    Detecting, characterizing, and interpreting gene-gene interactions or epistasis in studies of human disease susceptibility is both a mathematical and a computational challenge. To address this problem, we have previously developed a multifactor dimensionality reduction (MDR) method for collapsing high-dimensional genetic data into a single dimension (i.e. constructive induction) thus permitting interactions to be detected in relatively small sample sizes. In this paper, we describe a comprehensive and flexible framework for detecting and interpreting gene-gene interactions that utilizes advances in information theory for selecting interesting single-nucleotide polymorphisms (SNPs), MDR for constructive induction, machine learning methods for classification, and finally graphical models for interpretation. We illustrate the usefulness of this strategy using artificial datasets simulated from several different two-locus and three-locus epistasis models. We show that the accuracy, sensitivity, specificity, and precision of a naïve Bayes classifier are significantly improved when SNPs are selected based on their information gain (i.e. class entropy removed) and reduced to a single attribute using MDR. We then apply this strategy to detecting, characterizing, and interpreting epistatic models in a genetic study (n = 500) of atrial fibrillation and show that both classification and model interpretation are significantly improved.
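The SNP-selection step, ranking markers by information gain (class entropy removed), can be sketched on fabricated genotypes; here only snp 0 is constructed to track case status:

```python
# Information-gain ranking of SNPs on a made-up genotype matrix.
# Genotypes are coded 0/1/2; only column 0 carries signal.
import numpy as np

def entropy(labels):
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

def information_gain(genotypes, status):
    """H(class) - H(class | SNP) for one SNP column, in bits."""
    h = entropy(status)
    for g in np.unique(genotypes):
        mask = genotypes == g
        h -= mask.mean() * entropy(status[mask])
    return h

rng = np.random.default_rng(2)
status = rng.integers(0, 2, size=500)        # 0 = control, 1 = case
snps = rng.integers(0, 3, size=(500, 5))     # noise SNPs
snps[:, 0] = np.clip(status + rng.integers(0, 2, size=500), 0, 2)  # informative

gains = [information_gain(snps[:, j], status) for j in range(snps.shape[1])]
best = int(np.argmax(gains))
print(best, [round(g, 3) for g in gains])
```

In the paper's framework the top-ranked SNPs would then feed MDR's constructive induction rather than being classified directly.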

  19. Enantioselective Synthesis of SNAP-7941

    PubMed Central

    Goss, Jennifer M.; Schaus, Scott E.

    2009-01-01

    An enantioselective synthesis of SNAP-7941, a potent melanin-concentrating hormone receptor antagonist, was achieved using two organocatalytic methods. The first method, used to synthesize the enantioenriched dihydropyrimidone core, was the Cinchona alkaloid-catalyzed Mannich reaction of β-keto esters with acyl imines; the second was the chiral phosphoric acid-catalyzed Biginelli reaction. Completion of the synthesis was accomplished via selective urea formation at the N3 position of the dihydropyrimidone with the 3-(4-phenylpiperidin-1-yl)propyl amine side chain fragment. The synthesis of SNAP-7941 highlights the utility of asymmetric organocatalytic methods in the construction of an important class of chiral heterocycles. PMID:18767801

  20. A method for simplifying the analysis of traffic accidents injury severity on two-lane highways using Bayesian networks.

    PubMed

    Mujalli, Randa Oqab; de Oña, Juan

    2011-10-01

    This study describes a method for reducing the number of variables frequently considered in modeling the severity of traffic accidents. The method's efficiency is assessed by constructing Bayesian networks (BNs). It is based on a two-stage selection process. Several variable selection algorithms, commonly used in data mining, are applied in order to select subsets of variables. BNs are built using the selected subsets, and their performance is compared with that of the original BN (with all the variables) using five indicators. The BNs that improve the indicators' values are further analyzed to identify the most significant variables (accident type, age, atmospheric factors, gender, lighting, number of injured, and occupant involved). A new BN is built using these variables, and the indicators show, in most cases, a statistically significant improvement with respect to the original BN. It is thus possible to reduce the number of variables used to model traffic accident injury severity through BNs without reducing the performance of the model. The study provides safety analysts with a methodology that could be used to minimize the number of variables needed to determine the injury severity of traffic accidents efficiently, without reducing the performance of the model. Copyright © 2011 Elsevier Ltd. All rights reserved.
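The two-stage idea, select a variable subset, rebuild the model, and check that performance holds, can be sketched with a hand-rolled naive Bayes standing in for the paper's Bayesian networks. The "accident" data are synthetic, with only the first two variables related to severity:

```python
# Compare a categorical classifier built on all variables vs. a reduced
# subset. Data and variable names are invented; a simple naive Bayes with
# Laplace smoothing substitutes for the paper's Bayesian networks.
import numpy as np

class CategoricalNB:
    def fit(self, X, y):
        self.classes = np.unique(y)
        self.priors = {c: np.mean(y == c) for c in self.classes}
        self.likes = {}  # Laplace-smoothed P(x_j = v | class)
        for c in self.classes:
            Xc = X[y == c]
            self.likes[c] = [
                {v: (np.sum(Xc[:, j] == v) + 1) / (len(Xc) + 3) for v in (0, 1, 2)}
                for j in range(X.shape[1])
            ]
        return self

    def predict(self, X):
        scores = np.array([
            [np.log(self.priors[c]) + sum(np.log(self.likes[c][j][x[j]])
                                          for j in range(len(x)))
             for c in self.classes] for x in X])
        return self.classes[scores.argmax(axis=1)]

rng = np.random.default_rng(3)
X = rng.integers(0, 3, size=(600, 8))         # 8 categorical "accident" variables
y = ((X[:, 0] + X[:, 1]) >= 3).astype(int)    # severity depends on vars 0-1 only
Xtr, Xte, ytr, yte = X[:400], X[400:], y[:400], y[400:]

full = np.mean(CategoricalNB().fit(Xtr, ytr).predict(Xte) == yte)
reduced = np.mean(CategoricalNB().fit(Xtr[:, :2], ytr).predict(Xte[:, :2]) == yte)
print(round(full, 2), round(reduced, 2))
```

Dropping the six irrelevant variables leaves accuracy intact, which is the property the study's indicators are designed to verify.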

  1. Total sulfur determination in residues of crude oil distillation using FT-IR/ATR and variable selection methods.

    PubMed

    Müller, Aline Lima Hermes; Picoloto, Rochele Sogari; de Azevedo Mello, Paola; Ferrão, Marco Flores; de Fátima Pereira dos Santos, Maria; Guimarães, Regina Célia Lourenço; Müller, Edson Irineu; Flores, Erico Marlon Moraes

    2012-04-01

    Total sulfur concentration was determined in atmospheric residue (AR) and vacuum residue (VR) samples obtained from the petroleum distillation process by Fourier transform infrared spectroscopy with attenuated total reflectance (FT-IR/ATR) in association with chemometric methods. The calibration and prediction sets consisted of 40 and 20 samples, respectively. Calibration models were developed using two variable selection methods: interval partial least squares (iPLS) and synergy interval partial least squares (siPLS). Different treatments and pre-processing steps were also evaluated for the development of the models. The pre-treatment based on multiplicative scatter correction (MSC) and mean-centered data were selected for model construction. The use of siPLS as the variable selection method provided a model with root mean square error of prediction (RMSEP) values significantly better than those obtained by the PLS model using all variables. The best model was obtained using the siPLS algorithm with the spectra divided into 20 intervals and combinations of 3 intervals (911-824, 823-736, and 737-650 cm⁻¹). This model produced an RMSECV of 400 mg kg⁻¹ S and an RMSEP of 420 mg kg⁻¹ S, with a correlation coefficient of 0.990. Copyright © 2011 Elsevier B.V. All rights reserved.
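An interval-selection loop in the spirit of iPLS can be sketched as: split the spectrum into intervals, fit a model per interval, and keep the interval with the lowest cross-validated RMSE. Ordinary least squares stands in for PLS to keep the sketch dependency-light; spectra and sulfur values are simulated, with signal placed only in one band:

```python
# Schematic interval selection (iPLS-style). OLS substitutes for PLS;
# spectra, concentrations, and interval sizes are all invented.
import numpy as np

def rmsecv(X, y, n_folds=5):
    """Cross-validated RMSE of an OLS model on feature block X."""
    folds = np.array_split(np.arange(len(y)), n_folds)
    errs = []
    for test in folds:
        train = np.setdiff1d(np.arange(len(y)), test)
        coef, *_ = np.linalg.lstsq(X[train], y[train], rcond=None)
        errs.append((X[test] @ coef - y[test]) ** 2)
    return np.sqrt(np.concatenate(errs).mean())

rng = np.random.default_rng(4)
n_samples, n_wavenumbers, n_intervals = 60, 200, 20
conc = rng.uniform(100, 1000, n_samples)                   # "sulfur", mg/kg
spectra = rng.normal(0, 0.02, (n_samples, n_wavenumbers))
spectra[:, 150:160] += np.outer(conc / 1000, np.ones(10))  # analyte band

width = n_wavenumbers // n_intervals
scores = [rmsecv(spectra[:, i*width:(i+1)*width], conc) for i in range(n_intervals)]
best = int(np.argmin(scores))
print("best interval:", best)  # the band at columns 150-159 lies in interval 15
```

siPLS extends this by scoring combinations of intervals rather than single ones.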

  2. Role of physicochemical properties in the activation of peroxisome proliferator-activated receptor δ.

    PubMed

    Maltarollo, Vinícius G; Homem-de-Mello, Paula; Honorio, Káthia M

    2011-10-01

    Current research on treatments for metabolic diseases involves a class of biological receptors called peroxisome proliferator-activated receptors (PPARs), which control the metabolism of carbohydrates and lipids. A subclass of these receptors, PPARδ, regulates several metabolic processes, and the substances that activate them are being studied as new drug candidates for the treatment of diabetes mellitus and metabolic syndrome. In this study, several PPARδ agonists with experimental biological activity were selected for a structural and chemical study. Electronic, stereochemical, lipophilic, and topological descriptors were calculated for the selected compounds using various theoretical methods, such as density functional theory (DFT). Fisher's weight and principal components analysis (PCA) were employed to select the most relevant variables for this study. The partial least squares (PLS) method was used to construct the multivariate statistical model; the best model obtained had 4 PCs, q² = 0.80, and r² = 0.90, indicating good internal consistency. The prediction residuals calculated for the compounds in the test set had low values, indicating the good predictive capability of our PLS model. The model obtained in this study is reliable and can be used to predict the biological activity of new untested compounds. Docking studies have also confirmed the importance of the molecular descriptors selected for this system.

  3. Report on Subway Tunneling Needs of 13 Selected U.S. Cities, 1971-75

    DOT National Transportation Integrated Search

    1972-06-01

    This report establishes proposed subway tunneling construction needs for thirteen selected U.S. cities during 1971-75 as given by the transit authorities. This information will be used to estimate the demand for subway tunnel construction. This deman...

  4. Triage for action: Systematic assessment and dissemination of construction health and safety research.

    PubMed

    Baker, Robin; Chang, Charlotte; Bunting, Jessica; Betit, Eileen

    2015-08-01

    Research translation too often relies on passive methods that fail to reach those who can impact the workplace. The need for better research-to-practice (r2p) approaches is especially pressing in construction, where a disproportionate number of workers suffer serious injury and illness. A triage process was designed and used to systematically review completed research, assess r2p readiness, establish priorities, and launch dissemination follow-up efforts. A mixed quantitative and qualitative approach was used. The process proved effective in ensuring that significant findings and evidence-based solutions are disseminated actively. Key factors emerged in the selection of follow-up priorities, including the availability of partners able to reach end users, windows of opportunity, and cross-cutting approaches that can benefit multiple dissemination efforts. Use of a systematic triage process may have an important role to play in building r2p capacity in construction safety and health. © 2015 Wiley Periodicals, Inc.

  5. Heuristic Analysis Model of Nitrided Layers’ Formation Consisting of the Image Processing and Analysis and Elements of Artificial Intelligence

    PubMed Central

    Wójcicki, Tomasz; Nowicki, Michał

    2016-01-01

    The article presents a selected area of research and development concerning methods of material analysis based on automatic image recognition of the investigated metallographic sections. The objectives of the analyses of materials for gas nitriding technology are described. The methods of preparing nitrided layers, the steps of the process, and the construction and operation of devices for gas nitriding are given. We discuss the possibility of using digital image processing methods in the analysis of the materials, as well as their essential task groups: improving image quality, segmentation, morphological transformations, and image recognition. The developed model for analyzing nitrided layer formation, covering image processing and analysis techniques as well as selected artificial intelligence methods, is presented. The model is divided into stages, which are formalized in order to better reproduce their actions. The validation of the presented method is performed. The advantages and limitations of the developed solution, as well as the possibilities of its practical use, are listed. PMID:28773389

  6. Study of the thermal properties of selected PCMs for latent heat storage in buildings

    NASA Astrophysics Data System (ADS)

    Valentova, Katerina; Pechackova, Katerina; Prikryl, Radek; Ostry, Milan; Zmeskal, Oldrich

    2017-07-01

    The paper is focused on measurements of the thermal properties of selected phase change materials (PCMs) that can be used for latent heat storage in building structures. The thermal properties were measured by the transient step-wise method and analyzed by thermal spectroscopy. The thermal properties of three different materials (RT18HC, RT28HC, and RT35HC) were determined in the solid, liquid, and phase-change regions and correlated with differential scanning calorimetry (DSC) measurements. The results will be used to determine the optimum ratio of components for the construction of drywall and plasters containing these materials.

  7. Intensification of constructed wetlands for land area reduction: a review.

    PubMed

    Ilyas, Huma; Masih, Ilyas

    2017-05-01

    The large land area requirement of constructed wetlands (CWs) is a major limitation on their application, especially in densely populated and mountainous areas. This review paper provides insights on different strategies applied for the reduction of land area, including stack design and intensification of CWs with different aeration methods. The impacts of different aeration methods on performance and land area reduction were extensively and critically evaluated for nine wetland systems under three aeration strategies, tidal flow (TF), effluent recirculation (ER), and artificial aeration (AA), applied to three types of CWs: the vertical flow constructed wetland (VFCW), the horizontal flow constructed wetland (HFCW), and the hybrid constructed wetland (HCW). The area reduction and pollutant removal efficiency showed substantial variation among different types of CWs and aeration strategies. The ER-VFCW had the smallest footprint of 1.1 ± 0.5 m² PE⁻¹ (population equivalent), followed by the TF-VFCW with a footprint of 2.1 ± 1.8 m² PE⁻¹; the largest footprint was that of the AA-HFCW (7.8 ± 4.7 m² PE⁻¹). When both footprint and removal efficiency are major indicators for the selection of wetland type, the best options for practical application could be TF-VFCW, ER-HCW, and AA-HCW. The data and results outlined in this review could be instructive for future studies and practical applications of CWs for wastewater treatment, especially in land-limited regions.

  8. Two-Way Gene Interaction From Microarray Data Based on Correlation Methods

    PubMed Central

    Alavi Majd, Hamid; Talebi, Atefeh; Gilany, Kambiz; Khayyer, Nasibeh

    2016-01-01

    Background Gene networks have seen a massive explosion of interest thanks to the development of high-throughput techniques for monitoring various aspects of gene activity. Networks offer a natural way to model interactions between genes, and extracting gene network information from high-throughput genomic data is an important and difficult task. Objectives The purpose of this study is to construct a two-way gene network based on parametric and nonparametric correlation coefficients. The first step in constructing a gene co-expression network is to score all pairs of gene vectors. The second step is to select a score threshold and connect all gene pairs whose scores exceed this value. Materials and Methods In this foundation-application study, we constructed two-way gene networks using nonparametric methods, namely Spearman's rank correlation coefficient and Blomqvist's measure, and compared them with Pearson's correlation coefficient. We surveyed six genes associated with venous thrombosis, built a matrix whose entries represent the score for the corresponding gene pair, and obtained two-way interactions using Pearson's correlation, Spearman's rank correlation, and Blomqvist's coefficient. Finally, these methods were compared with visual methods: Cytoscape, based on BIND, and Gene Ontology, based on molecular function; R software version 3.2 and Bioconductor were used to perform these methods. Results The results based on the Pearson and Spearman correlations were the same and were confirmed by the Cytoscape and GO visual methods; however, Blomqvist's coefficient was not confirmed by the visual methods. Conclusions Some of the correlation-coefficient results do not agree with the visualization, possibly because of the small amount of data. PMID:27621916
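The two-step network construction described, score all gene pairs and then threshold, can be sketched directly. Expression values are simulated (one truly co-expressed pair); the 0.7 threshold is an arbitrary choice:

```python
# Two-way gene-network construction: score every gene pair with Pearson and
# Spearman correlation, then connect pairs whose |score| exceeds a threshold.
# Expression data are simulated; gene indices stand in for gene names.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
n_samples = 40
g1 = rng.normal(size=n_samples)
expr = np.column_stack([
    g1,                                          # gene 0
    2.0 * g1 + rng.normal(0, 0.3, n_samples),    # gene 1: co-expressed with gene 0
    rng.normal(size=n_samples),                  # gene 2: unrelated
])

def edges(expr, corr_fn, threshold=0.7):
    n = expr.shape[1]
    out = set()
    for i in range(n):
        for j in range(i + 1, n):
            r = corr_fn(expr[:, i], expr[:, j])[0]
            if abs(r) > threshold:
                out.add((i, j))
    return out

print("Pearson: ", edges(expr, stats.pearsonr))
print("Spearman:", edges(expr, stats.spearmanr))
```

As in the study's result, the parametric and rank-based scores agree on which edge to draw here.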

  9. Optimal information networks: Application for data-driven integrated health in populations

    PubMed Central

    Servadio, Joseph L.; Convertino, Matteo

    2018-01-01

    Development of composite indicators for integrated health in populations typically relies on a priori assumptions rather than model-free, data-driven evidence. Traditional variable selection processes tend not to consider relatedness and redundancy among variables, instead considering only individual correlations. In addition, a unified method for assessing integrated health statuses of populations is lacking, making systematic comparison among populations impossible. We propose the use of maximum entropy networks (MENets) that use transfer entropy to assess interrelatedness among selected variables considered for inclusion in a composite indicator. We also define optimal information networks (OINs) that are scale-invariant MENets, which use the information in constructed networks for optimal decision-making. Health outcome data from multiple cities in the United States are applied to this method to create a systemic health indicator, representing integrated health in a city. PMID:29423440
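Transfer entropy, the pairwise score MENets are built from, can be estimated for short discrete series by plug-in counting. The binary series below are synthetic, with x driving y by one step:

```python
# Plug-in estimator of transfer entropy for binary series (history length 1):
# TE(X -> Y) = sum p(y1, y0, x0) log[ p(y1 | y0, x0) / p(y1 | y0) ].
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """TE from x to y for 0/1 series, in bits."""
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))
    n = len(y) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_xyz = c / n
        p_y0x0 = sum(v for k, v in triples.items() if k[1:] == (y0, x0)) / n
        p_y1y0 = sum(v for k, v in triples.items() if k[:2] == (y1, y0)) / n
        p_y0 = sum(v for k, v in triples.items() if k[1] == y0) / n
        te += p_xyz * np.log2(p_xyz * p_y0 / (p_y0x0 * p_y1y0))
    return te

rng = np.random.default_rng(6)
x = rng.integers(0, 2, 5000)
y = np.empty_like(x)
y[0] = 0
y[1:] = x[:-1]  # y copies x with a one-step lag

print(round(transfer_entropy(x, y), 3), round(transfer_entropy(y, x), 3))
```

The asymmetry (about 1 bit forward, about 0 backward) is what makes the measure useful for directed network edges.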

  10. Orthogonal Array Testing for Transmit Precoding based Codebooks in Space Shift Keying Systems

    NASA Astrophysics Data System (ADS)

    Al-Ansi, Mohammed; Alwee Aljunid, Syed; Sourour, Essam; Mat Safar, Anuar; Rashidi, C. B. M.

    2018-03-01

    In Space Shift Keying (SSK) systems, transmit precoding-based codebook approaches have been proposed to improve performance in limited-feedback channels. The receiver performs an exhaustive search over a predefined Full-Combination (FC) codebook to select the optimal codeword that maximizes the Minimum Euclidean Distance (MED) between the received constellations. This research aims to reduce the codebook size in order to minimize the selection time and the number of feedback bits. We therefore propose constructing the codebooks based on Orthogonal Array Testing (OAT) methods, owing to their powerful inherent properties. These methods yield a short codebook whose codewords are sufficient to cover almost all the possible effects included in the FC codebook. Numerical results show the effectiveness of the proposed OAT codebooks in terms of system performance and complexity.
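The pairwise-coverage property that makes OAT codebooks compact can be checked directly on a classical two-level orthogonal array built from a Hadamard matrix; the mapping from array rows to SSK precoding codewords is outside this sketch:

```python
# A two-level orthogonal array from a Hadamard matrix: every pair of
# columns covers all four level combinations equally often, so 8 rows
# "test" 7 binary factors instead of the 2^7 = 128 full combinations.
import numpy as np
from scipy.linalg import hadamard
from itertools import combinations
from collections import Counter

H = hadamard(8)
oa = ((1 - H[:, 1:]) // 2).astype(int)  # drop the all-ones column, map {+1,-1} -> {0,1}
print(oa.shape)  # (8, 7): 8 codewords in place of 128

# Pairwise balance check: each pair of factors sees (0,0), (0,1), (1,0),
# and (1,1) the same number of times.
for i, j in combinations(range(7), 2):
    pairs = Counter(zip(oa[:, i], oa[:, j]))
    assert set(pairs.values()) == {2}
print("all", len(list(combinations(range(7), 2))), "column pairs balanced")
```

This strength-2 coverage is the property that lets a short codebook stand in for near-exhaustive search.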

  11. Factors influencing tests of auditory processing: a perspective on current issues and relevant concerns.

    PubMed

    Cacace, Anthony T; McFarland, Dennis J

    2013-01-01

    Tests of auditory perception, such as those used in the assessment of central auditory processing disorders ([C]APDs), represent a domain in audiological assessment where measurement of this theoretical construct is often confounded by nonauditory abilities due to methodological shortcomings. These confounds include the effects of cognitive variables such as memory and attention, as well as suboptimal testing paradigms, including the use of verbal reproduction as a form of response selection. We argue that these factors need to be controlled more carefully and/or modified so that their impact on tests of auditory and visual perception is minimized. Our purpose is to advocate for a stronger theoretical framework than currently exists and to suggest better methodological strategies to improve assessment of auditory processing disorders (APDs). Emphasis is placed on adaptive forced-choice psychophysical methods and the use of matched tasks in multiple sensory modalities to achieve these goals. Together, this approach has the potential to improve the construct validity of the diagnosis, enhance and develop theory, and evolve into a preferred method of testing. We examine the methods commonly used in studies of APDs and, where possible, compare currently used methodology to contemporary psychophysical methods that emphasize computer-controlled forced-choice paradigms. In many cases, the procedures used in studies of APD introduce confounding factors that could be minimized if computer-controlled forced-choice psychophysical methods were utilized. Ambiguities of interpretation, indeterminate diagnoses, and unwanted confounds can be avoided by minimizing memory and attentional demands on the input end and precluding response-selection strategies that involve complex motor processes on the output end. We advocate the use of computer-controlled forced-choice psychophysical paradigms in combination with matched tasks in multiple sensory modalities to enhance the prospect of obtaining a valid diagnosis. American Academy of Audiology.

  12. The niche construction perspective: a critical appraisal.

    PubMed

    Scott-Phillips, Thomas C; Laland, Kevin N; Shuker, David M; Dickins, Thomas E; West, Stuart A

    2014-05-01

    Niche construction refers to the activities of organisms that bring about changes in their environments, many of which are evolutionarily and ecologically consequential. Advocates of niche construction theory (NCT) believe that standard evolutionary theory fails to recognize the full importance of niche construction, and consequently propose a novel view of evolution, in which niche construction and its legacy over time (ecological inheritance) are described as evolutionary processes, equivalent in importance to natural selection. Here, we subject NCT to critical evaluation, in the form of a collaboration between one prominent advocate of NCT, and a team of skeptics. We discuss whether niche construction is an evolutionary process, whether NCT obscures or clarifies how natural selection leads to organismal adaptation, and whether niche construction and natural selection are of equivalent explanatory importance. We also consider whether the literature that promotes NCT overstates the significance of niche construction, whether it is internally coherent, and whether it accurately portrays standard evolutionary theory. Our disagreements reflect a wider dispute within evolutionary theory over whether the neo-Darwinian synthesis is in need of reformulation, as well as different usages of some key terms (e.g., evolutionary process). © 2013 The Author(s). Evolution published by Wiley Periodicals, Inc. on behalf of The Society for the Study of Evolution.

  13. Selective targeting of melanoma by PEG-masked protein-based multifunctional nanoparticles

    PubMed Central

    Vannucci, Luca; Falvo, Elisabetta; Fornara, Manuela; Di Micco, Patrizio; Benada, Oldrich; Krizan, Jiri; Svoboda, Jan; Hulikova-Capkova, Katarina; Morea, Veronica; Boffi, Alberto; Ceci, Pierpaolo

    2012-01-01

    Background Nanoparticle-based systems are promising for the development of imaging and therapeutic agents. The main advantage of nanoparticles over traditional systems lies in the possibility of loading multiple functionalities onto a single molecule, which are useful for therapeutic and/or diagnostic purposes. These functionalities include targeting moieties which are able to recognize receptors overexpressed by specific cells and tissues. However, targeted delivery of nanoparticles requires an accurate system design. We present here a rationally designed, genetically engineered, and chemically modified protein-based nanoplatform for cell/tissue-specific targeting. Methods Our nanoparticle constructs were based on the heavy chain of the human protein ferritin (HFt), a highly symmetrical assembly of 24 subunits enclosing a hollow cavity. HFt-based nanoparticles were produced using both genetic engineering and chemical functionalization methods to impart several functionalities, ie, the α-melanocyte-stimulating hormone peptide as a melanoma-targeting moiety, stabilizing and HFt-masking polyethylene glycol molecules, rhodamine fluorophores, and magnetic resonance imaging agents. The constructs produced were extensively characterized by a number of physicochemical techniques, and assayed for selective melanoma-targeting in vitro and in vivo. Results Our HFt-based nanoparticle constructs functionalized with the α-melanocyte-stimulating hormone peptide moiety and polyethylene glycol molecules were specifically taken up by melanoma cells but not by other cancer cell types in vitro. Moreover, experiments in melanoma-bearing mice indicate that these constructs have an excellent tumor-targeting profile and a long circulation time in vivo. Conclusion By masking human HFt with polyethylene glycol and targeting it with an α-melanocyte-stimulating hormone peptide, we developed an HFt-based melanoma-targeting nanoplatform for application in melanoma diagnosis and treatment. 
These results could be of general interest, because the same strategy can be exploited to develop ad hoc nanoplatforms for specific delivery towards any cell/tissue type for which a suitable targeting moiety is available. PMID:22619508

  14. Inferring gene dependency network specific to phenotypic alteration based on gene expression data and clinical information of breast cancer.

    PubMed

    Zhou, Xionghui; Liu, Juan

    2014-01-01

    Although many methods have been proposed to reconstruct gene regulatory networks, most of them, when applied to sample-based data, cannot reveal the gene regulatory relations underlying a phenotypic change (e.g. normal versus cancer). In this paper, we adopt phenotype as a variable when constructing the gene regulatory network, whereas previous studies either neglected it or only used it to select differentially expressed genes as inputs for network construction. Specifically, we integrate phenotype information with gene expression data to identify gene dependency pairs by using conditional mutual information. A gene dependency pair (A,B) means that the influence of gene A on the phenotype depends on gene B. All identified gene dependency pairs constitute a directed network underlying the phenotype, namely the gene dependency network. In this way, we constructed the gene dependency network of breast cancer from gene expression data along with two different phenotype states (metastasis and non-metastasis). Moreover, we found the network to be scale-free, indicating that its hub genes with high out-degrees may play critical roles in the network. After functional investigation, these hub genes were found to be biologically significant and specifically related to breast cancer, which suggests that our gene dependency network is meaningful; this validity was further supported by a literature survey. From the network, we selected 43 discriminative hubs as a signature to build a classification model for distinguishing the distant metastasis risks of breast cancer patients, and the result outperforms classification models built on published signatures. In conclusion, we have proposed a promising way to construct the gene regulatory network from sample-based data, which has been shown to be effective and accurate in uncovering the hidden mechanism of the biological process and identifying the gene signature for phenotypic change.
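    The conditional-mutual-information step can be sketched with plug-in entropy estimates on discrete data; the XOR example below (where gene A's influence on the phenotype exists only given gene B) is our own illustration of a dependency pair, not the paper's data.

```python
import numpy as np
from collections import Counter

def entropy(*cols):
    """Plug-in joint Shannon entropy (bits) of one or more discrete columns."""
    n = len(cols[0])
    counts = Counter(zip(*cols))
    p = np.array(list(counts.values())) / n
    return -np.sum(p * np.log2(p))

def mutual_info(x, y):
    """I(X;Y) = H(X) + H(Y) - H(X,Y)."""
    return entropy(x) + entropy(y) - entropy(x, y)

def cond_mutual_info(x, y, z):
    """I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z)."""
    return entropy(x, z) + entropy(y, z) - entropy(x, y, z) - entropy(z)
```

A pair (A,B) would be flagged as a dependency pair when conditioning on B substantially changes the information A carries about the phenotype.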

  15. Selective classification and quantification model of C&D waste from material resources consumed in residential building construction.

    PubMed

    Mercader-Moyano, Pilar; Ramírez-de-Arellano-Agudo, Antonio

    2013-05-01

    The unfortunate economic situation involving Spain and the European Union is, among other factors, the result of intensive construction activity over recent years. The excessive consumption of natural resources, together with the impact caused by the uncontrolled dumping of untreated C&D waste in illegal landfills, has caused environmental pollution and a deterioration of the landscape. The objective of this research was to generate a selective classification and quantification model of C&D waste based on the material resources consumed in the construction of residential buildings, either new or renovated, namely the Conventional Constructive Model (CCM). A practical study of ten residential buildings in Seville, Spain, enabled the identification and quantification of the C&D waste generated in their construction and the origin of the waste, in terms of the building material from which it originated and its impact per m² constructed. This model enables other researchers to compare the various improvements proposed for minimizing the environmental impact of building a CCM, allows new corrective measures to be proposed in future policies that regulate the production and management of C&D waste from the design stage to the completion of the construction process, and supports sustainable management of C&D waste and the selection of materials for the construction of new or renovated buildings.
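    The quantification logic (material consumption per m² built times a loss fraction, summed over materials) can be sketched as below; the material list and all coefficients are invented for illustration and are not the model's measured values.

```python
# Illustrative inputs only: (consumption in kg per m2 built, loss fraction)
MATERIALS = {
    "concrete":      (310.0, 0.04),
    "ceramic brick": (145.0, 0.06),
    "mortar":        (90.0,  0.05),
    "gypsum":        (22.0,  0.08),
}

def waste_per_m2(materials):
    """kg of C&D waste generated per m2 built, classified by material of origin."""
    return {name: cons * loss for name, (cons, loss) in materials.items()}

def total_waste(materials, floor_area_m2):
    """Total kg of C&D waste for a building of the given constructed area."""
    return floor_area_m2 * sum(waste_per_m2(materials).values())
```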

  16. Discrimination and prediction of the origin of Chinese and Korean soybeans using Fourier transform infrared spectrometry (FT-IR) with multivariate statistical analysis

    PubMed Central

    Lee, Byeong-Ju; Zhou, Yaoyao; Lee, Jae Soung; Shin, Byeung Kon; Seo, Jeong-Ah; Lee, Doyup; Kim, Young-Suk

    2018-01-01

    The ability to determine the origin of soybeans is an important issue following the inclusion of this information in the labeling of agricultural food products becoming mandatory in South Korea in 2017. This study was carried out to construct a prediction model for discriminating Chinese and Korean soybeans using Fourier-transform infrared (FT-IR) spectroscopy and multivariate statistical analysis. The optimal prediction models for discriminating soybean samples were obtained by selecting appropriate scaling methods, normalization methods, variable influence on projection (VIP) cutoff values, and wave-number regions. The optimal partial-least-squares regression (PLSR) prediction model was constructed using second derivatives, vector normalization, unit-variance scaling, and the 4000–400 cm⁻¹ region (excluding water vapor and carbon dioxide). The PLSR model for discriminating Chinese and Korean soybean samples had the best predictability when no VIP cutoff value was applied. When identifying Chinese soybean samples, the PLSR model with the lowest root-mean-square error of prediction was obtained using a VIP cutoff value of 1.5. The optimal PLSR prediction model for discriminating Korean soybean samples was also obtained using a VIP cutoff value of 1.5. This is the first study that has combined FT-IR spectroscopy with normalization methods, VIP cutoff values, and selected wave-number regions for discriminating Chinese and Korean soybeans. PMID:29689113
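    The preprocessing factors named above (second derivatives, vector normalization, unit-variance scaling, a VIP cutoff) can be sketched as plain array operations; the crude finite-difference derivative and random spectra below stand in for proper Savitzky-Golay smoothing and real FT-IR data, and are illustrative only.

```python
import numpy as np

def second_derivative(spectra):
    """Crude finite-difference 2nd derivative along the wave-number axis
    (rows = samples, columns = wave numbers)."""
    return np.gradient(np.gradient(spectra, axis=1), axis=1)

def vector_normalize(spectra):
    """Scale each spectrum to unit Euclidean norm."""
    return spectra / np.linalg.norm(spectra, axis=1, keepdims=True)

def unit_variance_scale(x):
    """Column-wise unit-variance (autoscaling) across samples."""
    return (x - x.mean(axis=0)) / x.std(axis=0)

def vip_filter(x, vip_scores, cutoff=1.5):
    """Keep only wave-number variables whose VIP score exceeds the cutoff."""
    return x[:, vip_scores > cutoff]
```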

  17. Effects of surface anchoring on the electric Frederiks transition in ferronematic systems

    NASA Astrophysics Data System (ADS)

    Farrokhbin, Mojtaba; Kadivar, Erfan

    2016-11-01

    The effects of the anchoring phenomenon on the electric Frederiks transition threshold field in a nematic liquid crystal doped with ferroelectric nanoparticles are discussed. The polarizability of these nanoparticles, in combination with confinement effects, has drastic effects on ferronematic systems. This study is based on the Frank free energy and the Rapini-Papoular surface energy for a ferronematic liquid crystal with finite anchoring conditions. For different anchoring boundary conditions, the Euler-Lagrange equation of the total free energy is solved numerically using the finite difference method together with the relaxation method, and the Maxwell construction is used to select the physical solutions, allowing us to investigate the effects of different anchoring strengths on the Frederiks transition threshold field. The Maxwell construction method is employed to select three periodic solutions for the nematic liquid crystal director at the interfaces of a slab. In the interval from zero to π/2, there is only one solution for the director orientation; the NLC director rotates toward the normal to the surface at the walls as the applied electric field increases. Our numerical results illustrate that above the Frederiks transition, and at intermediate anchoring strength, nematic molecules exhibit different orientations at the slab boundaries. We also study the effects of different anchoring strengths, nanoparticle volume fractions, and polarizations on the Frederiks transition threshold field. We report that decreasing the nanoparticle polarization results in saturation of the Frederiks threshold; however, this does not happen for the nanoparticle volume fraction.
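    The finite-difference relaxation step can be sketched in reduced units for the plain nematic case: relax θ'' + h² sin θ cos θ = 0 on [0,1] (dimensionless field h, Frederiks threshold h = π) under strong symmetric anchoring. The ferroelectric-particle and Rapini-Papoular surface terms of the paper are omitted, so this is only a toy version of the numerical scheme.

```python
import numpy as np

def director_profile(h, n=51, iters=30000, theta0=0.0):
    """Relax theta'' + h^2 sin(theta)cos(theta) = 0 on [0, 1] by Jacobi
    iteration of the finite-difference stencil; theta0 is the anchored tilt
    imposed at both walls (strong anchoring limit)."""
    z = np.linspace(0.0, 1.0, n)
    dz = z[1] - z[0]
    theta = theta0 + 0.01 * np.sin(np.pi * z)   # small seed distortion
    theta[0] = theta[-1] = theta0
    for _ in range(iters):
        # theta_i = (theta_{i+1} + theta_{i-1} + dz^2 h^2 sin cos) / 2
        theta[1:-1] = 0.5 * (theta[2:] + theta[:-2]
                             + dz**2 * h**2
                             * np.sin(theta[1:-1]) * np.cos(theta[1:-1]))
    return theta
```

Below the threshold (h < π) the seed distortion relaxes back to the uniform state; above it, a finite mid-plane tilt develops.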

  18. Properties of healthcare teaming networks as a function of network construction algorithms.

    PubMed

    Zand, Martin S; Trayhan, Melissa; Farooq, Samir A; Fucile, Christopher; Ghoshal, Gourab; White, Robert J; Quill, Caroline M; Rosenberg, Alexander; Barbosa, Hugo Serrano; Bush, Kristen; Chafi, Hassan; Boudreau, Timothy

    2017-01-01

    Network models of healthcare systems can be used to examine how providers collaborate, communicate, refer patients to each other, and to map how patients traverse the network of providers. Most healthcare service network models have been constructed from patient claims data, using billing claims to link a patient with a specific provider in time. The data sets can be quite large (10⁶-10⁸ individual claims per year), making standard methods for network construction computationally challenging and thus requiring the use of alternate construction algorithms. While these alternate methods have seen increasing use in generating healthcare networks, there is little to no literature comparing the differences in the structural properties of the generated networks, which as we demonstrate, can be dramatically different. To address this issue, we compared the properties of healthcare networks constructed using different algorithms from 2013 Medicare Part B outpatient claims data. Three different algorithms were compared: binning, sliding frame, and trace-route. Unipartite networks linking either providers or healthcare organizations by shared patients were built using each method. We find that each algorithm produced networks with substantially different topological properties, as reflected by numbers of edges, network density, assortativity, clustering coefficients and other structural measures. Provider networks adhered to a power law, while organization networks were best fit by a power law with exponential cutoff. Censoring networks to exclude edges with fewer than 11 shared patients, a common de-identification practice for healthcare network data, markedly reduced edge numbers and network density, and greatly altered measures of vertex prominence such as the betweenness centrality.
Data analysis identified patterns in the distance patients travel between network providers, and a striking set of teaming relationships between providers in the Northeast United States and Florida, likely due to seasonal residence patterns of Medicare beneficiaries. We conclude that the choice of network construction algorithm is critical for healthcare network analysis, and discuss the implications of our findings for selecting the algorithm best suited to the type of analysis to be performed.
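    A binning-style construction can be sketched as a projection of (patient, provider) claims onto a provider-provider graph; the sliding-frame and trace-route variants differ (they use claim timing and ordering), and the `min_shared` parameter mimics the shared-patient censoring described above. The tiny claims list in the test is invented.

```python
from collections import defaultdict
from itertools import combinations

def shared_patient_network(claims, min_shared=1):
    """Project (patient, provider) claim records onto a unipartite
    provider-provider network; an edge's weight counts the patients the
    two providers share. Edges with weight < min_shared are censored,
    as is common for de-identification of healthcare network data."""
    providers_by_patient = defaultdict(set)
    for patient, provider in claims:
        providers_by_patient[patient].add(provider)
    weights = defaultdict(int)
    for providers in providers_by_patient.values():
        for a, b in combinations(sorted(providers), 2):
            weights[(a, b)] += 1
    return {edge: w for edge, w in weights.items() if w >= min_shared}
```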

  19. Fuzzy portfolio model with fuzzy-input return rates and fuzzy-output proportions

    NASA Astrophysics Data System (ADS)

    Tsaur, Ruey-Chyn

    2015-02-01

    In the finance market, a short-term investment strategy is usually applied in portfolio selection in order to reduce investment risk; however, the economy is uncertain and the investment period is short. Further, an investor has incomplete information for selecting a portfolio with crisp proportions for each chosen security. In this paper we present a new method of constructing a fuzzy portfolio model for the parameters of fuzzy-input return rates and fuzzy-output proportions, based on possibilistic mean-standard deviation models. Furthermore, we consider both excess and shortage of investment in different economic periods by using a fuzzy constraint for the sum of the fuzzy proportions, and we also account for the risks of securities investment and the vagueness of incomplete information during periods of economic depression in the portfolio selection. Finally, we present a numerical example of a portfolio selection problem to illustrate the proposed model, and a sensitivity analysis is performed on the results.
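    The possibilistic ingredients can be sketched for triangular fuzzy return rates (center a, left/right spreads α, β), using Carlsson-Fullér-style possibilistic moments; these formulas follow that general framework rather than the paper's exact model, and all numbers are illustrative.

```python
def possibilistic_mean(a, alpha, beta):
    """Possibilistic mean of a triangular fuzzy number (a, alpha, beta):
    center a, left spread alpha, right spread beta."""
    return a + (beta - alpha) / 6.0

def possibilistic_var(a, alpha, beta):
    """Possibilistic variance of the same triangular fuzzy number;
    for a symmetric number (alpha == beta) this reduces to alpha**2 / 6."""
    return (alpha + beta) ** 2 / 24.0

def portfolio_mean(weights, fuzzy_returns):
    """Expected portfolio return as the weighted sum of possibilistic means."""
    return sum(w * possibilistic_mean(*r) for w, r in zip(weights, fuzzy_returns))
```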

  20. Precise Spatially Selective Photothermolysis Using Modulated Femtosecond Lasers and Real-time Multimodal Microscopy Monitoring.

    PubMed

    Huang, Yimei; Lui, Harvey; Zhao, Jianhua; Wu, Zhenguo; Zeng, Haishan

    2017-01-01

    The successful application of lasers in the treatment of skin diseases and cosmetic surgery is largely based on the principle of conventional selective photothermolysis which relies strongly on the difference in the absorption between the therapeutic target and its surroundings. However, when the differentiation in absorption is not sufficient, collateral damage would occur due to indiscriminate and nonspecific tissue heating. To deal with such cases, we introduce a novel spatially selective photothermolysis method based on multiphoton absorption in which the radiant energy of a tightly focused near-infrared femtosecond laser beam can be directed spatially by aiming the laser focal point to the target of interest. We construct a multimodal optical microscope to perform and monitor the spatially selective photothermolysis. We demonstrate that precise alteration of the targeted tissue is achieved while leaving surrounding tissue intact by choosing appropriate femtosecond laser exposure with multimodal optical microscopy monitoring in real time.

  1. Precise Spatially Selective Photothermolysis Using Modulated Femtosecond Lasers and Real-time Multimodal Microscopy Monitoring

    PubMed Central

    Huang, Yimei; Lui, Harvey; Zhao, Jianhua; Wu, Zhenguo; Zeng, Haishan

    2017-01-01

    The successful application of lasers in the treatment of skin diseases and cosmetic surgery is largely based on the principle of conventional selective photothermolysis which relies strongly on the difference in the absorption between the therapeutic target and its surroundings. However, when the differentiation in absorption is not sufficient, collateral damage would occur due to indiscriminate and nonspecific tissue heating. To deal with such cases, we introduce a novel spatially selective photothermolysis method based on multiphoton absorption in which the radiant energy of a tightly focused near-infrared femtosecond laser beam can be directed spatially by aiming the laser focal point to the target of interest. We construct a multimodal optical microscope to perform and monitor the spatially selective photothermolysis. We demonstrate that precise alteration of the targeted tissue is achieved while leaving surrounding tissue intact by choosing appropriate femtosecond laser exposure with multimodal optical microscopy monitoring in real time. PMID:28255346

  2. Fundamental Movement Skills Are More than Run, Throw and Catch: The Role of Stability Skills.

    PubMed

    Rudd, James R; Barnett, Lisa M; Butson, Michael L; Farrow, Damian; Berry, Jason; Polman, Remco C J

    2015-01-01

    In the motor development literature, fundamental movement skills are divided into three constructs: locomotor, object control and stability skills. Most fundamental movement skills research has focused on children's competency in locomotor and object control skills. The first aim of this study was to validate a test battery to assess the construct of stability skills in children aged 6 to 10 (M age = 8.2, SD = 1.2). Second, we assessed how the stability skills construct fitted into a model of fundamental movement skill. The Delphi method was used to select the stability skill battery. Confirmatory factor analysis (CFA) was used to assess whether the skills loaded onto the same construct, and a new model of FMS was developed using structural equation modelling. Three postural control tasks were selected (the log roll, rock and back support) because they had good face and content validity. These skills also demonstrated good predictive validity: gymnasts scored significantly better than children without gymnastic training, children from a high-SES school performed better than those from mid- and low-SES schools, and the mid-SES children scored better than the low-SES children (all p < .05). Inter-rater reliability was excellent for all three skills (ICC = 0.81, 0.87, 0.87), as was test-retest reliability (ICC = 0.87-0.95). CFA provided good construct validity, and structural equation modelling revealed stability skills to be an independent factor in an overall FMS model which included locomotor (r = .88), object control (r = .76) and stability skills (r = .81). This study provides a rationale for the inclusion of stability skills in FMS assessment. The stability skills could be used alongside other FMS assessment tools to provide a holistic assessment of children's fundamental movement skills.
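    The inter-rater reliability figures above are intraclass correlations; a sketch of ICC(2,1) (two-way random effects, absolute agreement, single rater, in the Shrout-Fleiss sense) computed from a subjects-by-raters matrix follows. The data in the test are illustrative.

```python
import numpy as np

def icc_2_1(x):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.
    x is an (n_subjects, k_raters) matrix of ratings."""
    n, k = x.shape
    grand = x.mean()
    subj = x.mean(axis=1)                     # per-subject means
    rater = x.mean(axis=0)                    # per-rater means
    msb = k * np.sum((subj - grand) ** 2) / (n - 1)    # between subjects
    msc = n * np.sum((rater - grand) ** 2) / (k - 1)   # between raters
    resid = x - subj[:, None] - rater[None, :] + grand
    mse = np.sum(resid ** 2) / ((n - 1) * (k - 1))     # residual
    return (msb - mse) / (msb + (k - 1) * mse + k * (msc - mse) / n)
```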

  3. Cuyahoga River, Ohio Restoration Study. Third Interim Preliminary Feasibility Report on Erosion and Sedimentation. Volume II. Appendices A through H.

    DTIC Science & Technology

    1981-04-01

    Woodland: land that is primarily used to produce adapted wood crops and to provide tree cover; such land occurs in undeveloped areas of the park systems. [A flattened table of land-use areas (filling and dumping, construction areas) is omitted here.] Woodland improvement: this BMP involves selective thinning of maple or white pine; its cost is approximately $150 per acre.

  4. A variational approach to niche construction.

    PubMed

    Constant, Axel; Ramstead, Maxwell J D; Veissière, Samuel P L; Campbell, John O; Friston, Karl J

    2018-04-01

    In evolutionary biology, niche construction is sometimes described as a genuine evolutionary process whereby organisms, through their activities and regulatory mechanisms, modify their environment such as to steer their own evolutionary trajectory, and that of other species. There is ongoing debate, however, on the extent to which niche construction ought to be considered a bona fide evolutionary force, on a par with natural selection. Recent formulations of the variational free-energy principle as applied to the life sciences describe the properties of living systems, and their selection in evolution, in terms of variational inference. We argue that niche construction can be described using a variational approach. We propose new arguments to support the niche construction perspective, and to extend the variational approach to niche construction to current perspectives in various scientific fields. © 2018 The Authors.

  5. A variational approach to niche construction

    PubMed Central

    Ramstead, Maxwell J. D.; Veissière, Samuel P. L.; Campbell, John O.; Friston, Karl J.

    2018-01-01

    In evolutionary biology, niche construction is sometimes described as a genuine evolutionary process whereby organisms, through their activities and regulatory mechanisms, modify their environment such as to steer their own evolutionary trajectory, and that of other species. There is ongoing debate, however, on the extent to which niche construction ought to be considered a bona fide evolutionary force, on a par with natural selection. Recent formulations of the variational free-energy principle as applied to the life sciences describe the properties of living systems, and their selection in evolution, in terms of variational inference. We argue that niche construction can be described using a variational approach. We propose new arguments to support the niche construction perspective, and to extend the variational approach to niche construction to current perspectives in various scientific fields. PMID:29643221

  6. Use of the λ Red-recombineering method for genetic engineering of Pantoea ananatis

    PubMed Central

    Katashkina, Joanna I; Hara, Yoshihiko; Golubeva, Lyubov I; Andreeva, Irina G; Kuvaeva, Tatiana M; Mashko, Sergey V

    2009-01-01

    Background Pantoea ananatis, a member of the Enterobacteriaceae family, is a new and promising subject for biotechnological research. Over recent years, impressive progress in its application to L-glutamate production has been achieved. Nevertheless, genetic and biotechnological studies of Pantoea ananatis have been impeded because of the absence of genetic tools for rapid construction of direct mutations in this bacterium. The λ Red-recombineering technique previously developed in E. coli and used for gene inactivation in several other bacteria is a high-performance tool for rapid construction of precise genome modifications. Results In this study, the expression of λ Red genes in P. ananatis was found to be highly toxic. A screening was performed to select mutants of P. ananatis that were resistant to the toxic effects of λ Red. A mutant strain, SC17(0), was identified that grew well under conditions of simultaneous expression of λ gam, bet, and exo genes. Using this strain, procedures for fast introduction of multiple rearrangements into the Pantoea ananatis genome based on the λ Red-dependent integration of PCR-generated DNA fragments with flanking homologies as short as 40 bp have been demonstrated. Conclusion The λ Red-recombineering technology was successfully used for rapid generation of chromosomal modifications in the specially selected P. ananatis recipient strain. A procedure of electro-transformation with chromosomal DNA has been developed for transfer of marked mutations between different P. ananatis strains. Combination of these techniques with λ Int/Xis-dependent excision of selective markers significantly accelerates basic research and construction of producing strains. PMID:19389224

  7. Microbial Consortium with High Cellulolytic Activity (MCHCA) for Enhanced Biogas Production

    PubMed Central

    Poszytek, Krzysztof; Ciezkowska, Martyna; Sklodowska, Aleksandra; Drewniak, Lukasz

    2016-01-01

    The use of lignocellulosic biomass as a substrate in agricultural biogas plants is very popular and yields good results. However, the efficiency of anaerobic digestion, and thus biogas production, is not always satisfactory due to the slow or incomplete degradation (hydrolysis) of plant matter. To enhance the solubilization of the lignocellulosic biomass, various physical, chemical and biological pretreatment methods are used. The aim of this study was to select and characterize cellulose-degrading bacteria, and to construct a microbial consortium dedicated to the degradation of maize silage and enhancement of biogas production from this substrate. Over 100 strains of cellulose-degrading bacteria were isolated from: sewage sludge, a hydrolyzer from an agricultural biogas plant, cattle slurry and manure. After physiological characterization of the isolates, 16 strains (representatives of the Bacillus, Providencia, and Ochrobactrum genera) were chosen for the construction of a Microbial Consortium with High Cellulolytic Activity, called MCHCA. The selected strains had a high endoglucanase activity (exceeding 0.21 IU/mL CMCase activity) and a wide range of tolerance to various physical and chemical conditions. A lab-scale simulation of biogas production using the selected strains for degradation of maize silage was carried out in a two-bioreactor system, similar to those used in agricultural biogas plants. The obtained results showed that the constructed MCHCA consortium is capable of efficient hydrolysis of maize silage and increases biogas production by up to 38%, depending on the inoculum used for methane fermentation. The results of this work indicate that the mesophilic MCHCA has great potential for application on an industrial scale in agricultural biogas plants. PMID:27014244

  8. Psychosocial Adaptation to Disability Within the Context of Positive Psychology: Findings from the Literature.

    PubMed

    Martz, Erin; Livneh, Hanoch

    2016-03-01

    The purpose of this article is to review trends in research that has examined positive psychology constructs in the context of adapting to chronic illness and disability (CID). This article examines the empirical findings on the relationships between six selected positive psychology-associated constructs (optimism, hope, resilience, benefit-finding, meaning-making, and post-traumatic growth) and adaptation to disability. These six constructs were selected to represent the trends found in recent literature published on CID. The process of choosing these six variables included reviewing chapters on positive psychology and CID, reviewing the top rehabilitation journals that typically publish articles on psychosocial adaptation to CID, using search engines to find relevant journal articles published since the year 2000, and selecting the most important constructs based on the authors' professional judgment. The available evidence supports the unique benefits of these six positive psychology constructs in predicting successful adaptation to a range of disabling conditions. Based on the available findings, the authors offer four suggestions for occupational rehabilitation researchers.

  9. Simple construct evaluation with latent class analysis: An investigation of Facebook addiction and the development of a short form of the Facebook Addiction Test (F-AT).

    PubMed

    Dantlgraber, Michael; Wetzel, Eunike; Schützenberger, Petra; Stieger, Stefan; Reips, Ulf-Dietrich

    2016-09-01

    In psychological research, there is a growing interest in using latent class analysis (LCA) for the investigation of quantitative constructs. The aim of this study is to illustrate how LCA can be applied to gain insights on a construct and to select items during test development. We show the added benefits of LCA beyond factor-analytic methods, namely being able (1) to describe groups of participants that differ in their response patterns, (2) to determine appropriate cutoff values, (3) to evaluate items, and (4) to evaluate the relative importance of correlated factors. As an example, we investigated the construct of Facebook addiction using the Facebook Addiction Test (F-AT), an adapted version of the Internet Addiction Test (I-AT). Applying LCA facilitates the development of new tests and short forms of established tests. We present a short form of the F-AT based on the LCA results and validate the LCA approach and the short F-AT with several external criteria, such as chatting, reading newsfeeds, and posting status updates. Finally, we discuss the benefits of LCA for evaluating quantitative constructs in psychological research.
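    How LCA groups respondents by response pattern can be sketched with a minimal EM fit of a latent class model for binary items; the two-class Bernoulli model, the initialization, and the synthetic data in the test are illustrative assumptions, not the F-AT analysis.

```python
import numpy as np

def lca_em(x, n_classes=2, n_iter=200, seed=0):
    """EM for a latent class model with binary items.
    x: (n, m) 0/1 matrix. Returns class weights pi, per-class item
    probabilities theta, and posterior memberships (responsibilities)."""
    rng = np.random.default_rng(seed)
    n, m = x.shape
    pi = np.full(n_classes, 1.0 / n_classes)             # class weights
    theta = rng.uniform(0.3, 0.7, size=(n_classes, m))   # p(item=1 | class)
    for _ in range(n_iter):
        # E-step: responsibilities from Bernoulli log-likelihoods
        log_lik = (x @ np.log(theta).T
                   + (1 - x) @ np.log(1 - theta).T
                   + np.log(pi))
        log_lik -= log_lik.max(axis=1, keepdims=True)
        resp = np.exp(log_lik)
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: reweight class sizes and item probabilities
        nk = resp.sum(axis=0)
        pi = nk / n
        theta = np.clip((resp.T @ x) / nk[:, None], 1e-6, 1 - 1e-6)
    return pi, theta, resp
```

The responsibilities describe groups of participants that differ in their response patterns, which is what makes LCA useful for item evaluation and cutoff selection.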

  10. Ion Trapping with Fast-Response Ion-Selective Microelectrodes Enhances Detection of Extracellular Ion Channel Gradients

    PubMed Central

    Messerli, Mark A.; Collis, Leon P.; Smith, Peter J.S.

    2009-01-01

    Previously, functional mapping of channels has been achieved by measuring the passage of net charge and of specific ions with electrophysiological and intracellular fluorescence imaging techniques. However, functional mapping of ion channels using extracellular ion-selective microelectrodes (ISMs) has distinct advantages over the former methods. We have developed this method through measurement of extracellular K+ gradients caused by efflux through Ca2+-activated K+ channels expressed in Chinese hamster ovary cells. We report that electrodes constructed with short columns of a mechanically stable K+-selective liquid membrane respond quickly and measure changes in local [K+] consistent with a diffusion model. When used in close proximity to the plasma membrane (<4 μm), the ISMs pose a barrier to simple diffusion, creating an ion trap. The ion trap amplifies the local change in [K+] without dramatically changing the rise or fall time of the [K+] profile. Measurement of extracellular K+ gradients from activated rSlo channels shows that rapid events, 10–55 ms, can be characterized. This method provides a noninvasive means for functional mapping of channel location and density as well as for characterizing the properties of ion channels in the plasma membrane. PMID:19217875
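
    The diffusion model invoked above can be sketched with the textbook continuous point-source solution in an infinite medium; the source strength, distances, and times used below are hypothetical, and the real near-membrane geometry (hemispherical efflux, the electrode-induced ion trap) would modify it.

```python
import math

def point_source_conc(r, t, Q, D=1.96e-9):
    """Concentration (mol/m^3) at distance r (m) from a continuous point
    source releasing Q (mol/s) into an infinite medium, after time t (s):
        C(r, t) = Q / (4*pi*D*r) * erfc(r / (2*sqrt(D*t)))
    D defaults to the aqueous diffusion coefficient of K+ (~1.96e-9 m^2/s).
    """
    return Q / (4 * math.pi * D * r) * math.erfc(r / (2 * math.sqrt(D * t)))
```

    At micrometer distances the profile approaches the steady state Q/(4*pi*D*r) on sub-second timescales, so the measured rise time at a given distance constrains the effective D and source strength.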

  11. Design and Construction of Airport Pavements on Expansive Soils

    DTIC Science & Technology

    1976-06-01

    Selection of the type and amount of stabilizing agent (lime, cement, asphalt, only) (4) Test methods to determine the physical properties of sta... (8) Investigate the effect of sulfate on cement-stabilized soils and establish ...terested because the properties of soil/cement mixtures and the relationships existing among these properties and various test values are discussed

  12. VizieR Online Data Catalog: Luminosity and redshift of galaxies from WISE/SDSS (Toba+, 2014)

    NASA Astrophysics Data System (ADS)

    Toba, Y.; Oyabu, S.; Matsuhara, H.; Malkan, M. A.; Gandhi, P.; Nakagawa, T.; Isobe, N.; Shirahata, M.; Oi, N.; Ohyama, Y.; Takita, S.; Yamauchi, C.; Yano, K.

    2017-07-01

    We selected 12 and 22 um flux-limited galaxies based on the WISE (Cat. II/311) and SDSS (Cat. II/294) catalogs, and these galaxies were then classified into five types according to their optical spectroscopic information in the SDSS catalog. For spectroscopically classified galaxies, we constructed the luminosity functions using the 1/Vmax method, considering the detection limit of the WISE and SDSS catalogs. (1 data file).
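
    A schematic, purely Euclidean version of the 1/Vmax estimator (no K-corrections, evolution corrections, or cosmological volume elements, all of which the actual study must handle) can be written as:

```python
import numpy as np

def vmax_lf(lum, dist, flux_lim, bins, sky_frac=1.0):
    """Estimate a luminosity function with the classical 1/Vmax method.

    lum      : luminosities (arbitrary units)
    dist     : distances (e.g. Mpc); fluxes follow F = L / (4*pi*d^2)
    flux_lim : survey flux limit
    bins     : luminosity bin edges
    Euclidean sketch only -- no K-corrections or cosmological volumes.
    """
    flux = lum / (4 * np.pi * dist**2)
    keep = flux >= flux_lim                      # flux-limited selection
    d_max = dist * np.sqrt(flux / flux_lim)      # max distance still above limit
    v_max = sky_frac * (4/3) * np.pi * d_max**3  # max detectable volume
    phi = np.zeros(len(bins) - 1)
    idx = np.digitize(lum, bins) - 1
    for i, (k, j) in enumerate(zip(keep, idx)):
        if k and 0 <= j < len(phi):
            phi[j] += 1.0 / v_max[i]             # each object weighted by 1/Vmax
    return phi / np.diff(bins)                   # number density per unit luminosity
```

    Each detected object contributes the reciprocal of the volume within which it would still have exceeded the flux limit, which corrects the flux-limited sample for Malmquist-type incompleteness.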

  13. Strategies for the construction and use of peptide and antibody libraries displayed on phages.

    PubMed

    Pini, Alessandro; Giuliani, Andrea; Ricci, Claudia; Runci, Ylenia; Bracci, Luisa

    2004-12-01

    Combinatorial chemistry and biology have become popular methods for the identification of bio-active molecules in drug discovery. A widely used technique in combinatorial biology is "phage display", by which peptides, antibody fragments and enzymes are displayed on the surface of bacteriophages, and can be selected by simple procedures of biopanning. The construction of phage libraries of peptides or antibody fragments provides a huge source of ligands and bio-active molecules that can be isolated from the library without laborious studies on antigen characteristics and prediction of ligand structure. This "irrational" approach to the construction of new drugs is extremely rapid and is now used by thousands of laboratories world-wide. The bottleneck in this procedure is the availability of large reliable libraries that can be used repeatedly over the years without loss of ligand expression and diversity. Construction of personalized libraries is therefore important for public and private laboratories engaged in the isolation of specific molecules for therapeutic or diagnostic use. Here we report the general strategies for constructing large phage peptide and antibody libraries, based on the experience of researchers who built the world's most widely used libraries. Particular attention is paid to advanced strategies for library construction, preservation and panning.

  14. Formation of integrated structural units using the systematic and integrated method when implementing high-rise construction projects

    NASA Astrophysics Data System (ADS)

    Abramov, Ivan

    2018-03-01

    Development of design documentation for a future construction project gives rise to a number of issues, the main one being the selection of manpower for the structural units of the project's overall implementation system. Well-planned and competently staffed integrated structural construction units help achieve a high level of reliability and labor productivity and avoid negative (extraordinary) situations during the construction period, eventually ensuring improved project performance. Research priorities include the development of theoretical recommendations for enhancing the reliability of a structural unit staffed as an integrated construction crew. The author focuses on the identification of destabilizing factors affecting the formation of an integrated construction crew; the assessment of these destabilizing factors; and, based on the developed mathematical model, highlighting the impact of these factors on the integration criterion, with subsequent identification of an efficiency and reliability criterion for the structural unit in general. The purpose of this article is to develop theoretical recommendations and scientific and methodological provisions of an organizational and technological nature in order to identify a reliability criterion for a structural unit based on manpower integration and productivity criteria. With this purpose in mind, complex scientific tasks have been defined requiring special research and the development of corresponding provisions and recommendations based on the system analysis findings presented herein.

  15. Capture-SELEX: Selection of DNA Aptamers for Aminoglycoside Antibiotics

    PubMed Central

    2012-01-01

    Small organic molecules are challenging targets for aptamer selection using the SELEX technology (SELEX: Systematic Evolution of Ligands by EXponential enrichment). Often they are not suitable for immobilization on solid surfaces, which is a common procedure in known aptamer selection methods. The Capture-SELEX procedure allows the selection of DNA aptamers for solute targets. A special SELEX library was constructed with the aim of immobilizing this library on magnetic beads or other surfaces. For this purpose a docking sequence was incorporated into the random region of the library, enabling hybridization to a complementary oligo fixed on magnetic beads. Oligonucleotides of the library which exhibit high affinity to the target, and a secondary structure fitting the target, are released from the beads for binding to the target during the aptamer selection process. The oligonucleotides of these binding complexes were amplified, purified, and immobilized via the docking sequence to the magnetic beads as the starting point of the following selection round. Based on this Capture-SELEX procedure, the successful DNA aptamer selection for the aminoglycoside antibiotic kanamycin A as a small-molecule target is described. PMID:23326761

  16. Automated selection of trabecular bone regions in knee radiographs.

    PubMed

    Podsiadlo, P; Wolski, M; Stachowiak, G W

    2008-05-01

    Osteoarthritic (OA) changes in knee joints can be assessed by analyzing the structure of trabecular bone (TB) in the tibia. This analysis is performed on TB regions selected manually by a human operator on x-ray images. Manual selection is time-consuming, tedious, and expensive. Even if a radiologist expert or highly trained person is available to select regions, high inter- and intraobserver variabilities are still possible. A fully automated image segmentation method was, therefore, developed to select the bone regions for numerical analyses of changes in bone structures. The newly developed method consists of image preprocessing, delineation of cortical bone plates (active shape model), and location of regions of interest (ROIs). The method was trained on an independent set of 40 x-ray images. Automatically selected regions were compared to the "gold standard" that contains ROIs selected manually by a radiologist expert on 132 x-ray images. All images were acquired from subjects locked in a standardized standing position using a radiography rig. The size of each ROI is 12.8 x 12.8 mm. The automated method showed good agreement with the gold standard [similarity index (SI) = 0.83 (medial) and 0.81 (lateral); offset = [-1.78, 1.27] x [-0.65, 0.26] mm (medial) and [-2.15, 1.59] x [-0.58, 0.52] mm (lateral)]. Bland and Altman plots were constructed for fractal signatures, and changes of fractal dimensions (FDs) with region offsets between the gold standard and the automatically selected regions were calculated. The plots showed a random scatter and the 95% confidence intervals were (-0.006, 0.008) and (-0.001, 0.011). The changes of FDs with region offsets were less than 0.035. Previous studies showed that differences in FDs between non-OA and OA bone regions were greater than 0.05. ROIs were also selected by a second radiologist and then evaluated. Results indicated that the newly developed method could replace a human operator and produce bone regions with an accuracy that is sufficient for fractal analyses of bone texture.
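
    The similarity index used for the comparison can be illustrated with the Dice form, a common choice for ROI overlap (assumed here; the paper may define SI differently):

```python
import numpy as np

def similarity_index(mask_a, mask_b):
    """Dice similarity index between two binary ROI masks:
    SI = 2*|A & B| / (|A| + |B|).  SI = 1 means perfect overlap."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    inter = np.logical_and(a, b).sum()
    total = a.sum() + b.sum()
    return 2.0 * inter / total if total else 1.0
```

    Applied to an automatically selected ROI and the expert's ROI rendered as pixel masks, values around 0.8 indicate substantial but not perfect overlap, consistent with the small offsets reported above.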

  17. Landslide risk assessment

    USGS Publications Warehouse

    Lessing, P.; Messina, C.P.; Fonner, R.F.

    1983-01-01

    Landslide risk can be assessed by evaluating geological conditions associated with past events. A sample of 2,416 slides from urban areas in West Virginia, each with 12 associated geological factors, has been analyzed using SAS computer methods. In addition, selected data have been normalized to account for areal distribution of rock formations, soil series, and slope percents. Final calculations yield landslide risk assessments of 1.50 = high risk. The simplicity of the method provides for a rapid, initial assessment prior to financial investment. However, it does not replace on-site investigations, nor excuse poor construction. © 1983 Springer-Verlag New York Inc.

  18. Usage of noncontact human body measurements for development of Army Work Wear Trousers

    NASA Astrophysics Data System (ADS)

    Dabolina, Inga; Lapkovska, Eva; Vilumsone, Ausma

    2017-10-01

    The paper addresses issues related to imperfections of clothing fit, garment construction solutions and control measurement systems for finished products, which were identified while analysing army soldier work-wear trousers. The aim is to obtain target-group body measurements using a noncontact anthropometric data acquisition method (3D scanning) and to select and analyse the scanned data suitable for trouser design. Tasks include comparison of the scanned data with manually taken body measurements and with corresponding human body measurement standards, to establish the potential advantages of the noncontact method in solving trouser design issues.

  19. Diagnostic analysis of liver B ultrasonic texture features based on LM neural network

    NASA Astrophysics Data System (ADS)

    Chi, Qingyun; Hua, Hu; Liu, Menglin; Jiang, Xiuying

    2017-03-01

    In this study, B ultrasound images of 124 patients with benign and malignant lesions were randomly selected as the study objects. The B ultrasound images of the liver were enhanced and de-noised. By constructing the gray level co-occurrence matrix, which reflects the texture information at each angle, 22 texture features were extracted and reduced by principal component analysis, then combined with an LM neural network for diagnosis and classification. Experimental results show that this is a rapid and effective diagnostic method for liver imaging, which provides a quantitative basis for clinical diagnosis of liver diseases.
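
    The feature pipeline (gray level co-occurrence matrix, then texture features, then PCA) can be sketched in a self-contained way; the quantization level, offsets, and three-feature subset below are illustrative assumptions, not the study's 22-feature set.

```python
import numpy as np

def glcm_features(img, levels=8, offsets=((0, 1), (1, 0), (1, 1), (1, -1))):
    """Gray-level co-occurrence features (contrast, energy, homogeneity)
    averaged over four directions, from a quantized grayscale image."""
    q = (img.astype(float) / (img.max() + 1e-12) * (levels - 1)).astype(int)
    i_idx, j_idx = np.meshgrid(np.arange(levels), np.arange(levels), indexing="ij")
    feats = []
    for dr, dc in offsets:
        glcm = np.zeros((levels, levels))
        r0, r1 = max(0, -dr), q.shape[0] - max(0, dr)
        c0, c1 = max(0, -dc), q.shape[1] - max(0, dc)
        src = q[r0:r1, c0:c1]
        dst = q[r0 + dr:r1 + dr, c0 + dc:c1 + dc]
        np.add.at(glcm, (src.ravel(), dst.ravel()), 1)  # count pixel pairs
        p = glcm / glcm.sum()
        feats.append([
            (p * (i_idx - j_idx) ** 2).sum(),               # contrast
            (p ** 2).sum(),                                 # energy
            (p / (1 + np.abs(i_idx - j_idx))).sum(),        # homogeneity
        ])
    return np.mean(feats, axis=0)

def pca_reduce(F, k=2):
    """Project a feature matrix F (n_samples x n_features) onto its
    first k principal components via SVD."""
    Fc = F - F.mean(axis=0)
    _, _, vt = np.linalg.svd(Fc, full_matrices=False)
    return Fc @ vt[:k].T
```

    The reduced feature vectors would then feed the classifier (here, the LM-trained neural network used by the authors).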

  20. Stereoselective heterocycle synthesis through oxidative carbon-hydrogen bond activation.

    PubMed

    Liu, Lei; Floreancig, Paul E

    2010-01-01

    Heterocycles are ubiquitous structures in both drugs and natural products, and efficient methods for their construction are being pursued constantly. Carbon-hydrogen bond activation offers numerous advantages for the synthesis of heterocycles with respect to minimizing the length of synthetic routes and reducing waste. As interest in chiral medicinal leads increases, stereoselective methods for heterocycle synthesis must be developed. The use of carbon-hydrogen bond activation reactions for stereoselective heterocycle synthesis has produced a range of creative transformations that provide a wide array of structural motifs, selected examples of which are described in this review.

  1. A rough set approach for determining weights of decision makers in group decision making

    PubMed Central

    Yang, Qiang; Du, Ping-an; Wang, Yong; Liang, Bin

    2017-01-01

    This study aims to present a novel approach for determining the weights of decision makers (DMs) based on rough group decision in multiple attribute group decision-making (MAGDM) problems. First, we construct a rough group decision matrix from all DMs’ decision matrixes on the basis of rough set theory. After that, we derive a positive ideal solution (PIS) founded on the average matrix of the rough group decision, and negative ideal solutions (NISs) founded on the lower and upper limit matrixes of the rough group decision. Then, we obtain the weight of each group member and the priority order of alternatives by using the relative closeness method, which depends on the distances from each individual group member’s decision to the PIS and NISs. Through comparisons with existing methods and an on-line business manager selection example, we show that the proposed method can provide more insights into the subjectivity and vagueness of DMs’ evaluations and selections. PMID:28234974
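
    One plausible reading of the relative-closeness weighting (a TOPSIS-style construction; the paper's exact aggregation over the lower- and upper-limit NISs may differ) is:

```python
import numpy as np

def relative_closeness(decisions, pis, nis_list):
    """Relative closeness of each DM's decision matrix to the positive ideal
    solution (PIS), against one or more negative ideal solutions (NISs):
        C = d_neg / (d_pos + d_neg)
    with Frobenius-norm distances and d_neg taken to the nearest NIS.
    Larger C means the decision is closer to the group ideal, so that DM
    receives a larger weight."""
    c = []
    for D in decisions:
        d_pos = np.linalg.norm(D - pis)
        d_neg = min(np.linalg.norm(D - n) for n in nis_list)  # nearest NIS
        c.append(d_neg / (d_pos + d_neg))
    w = np.array(c)
    return w / w.sum()  # normalized DM weights
```

    A DM whose matrix sits on the PIS gets closeness 1; one sitting on a NIS gets 0, and the normalized values serve directly as group-aggregation weights.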

  2. Wavelet neural networks: a practical guide.

    PubMed

    Alexandridis, Antonios K; Zapranis, Achilleas D

    2013-06-01

    Wavelet networks (WNs) are a new class of networks which have been used with great success in a wide range of applications. However, a generally accepted framework for applying WNs is missing from the literature. In this study, we present a complete statistical model identification framework in order to apply WNs in various applications. The following subjects were thoroughly examined: the structure of a WN, training methods, initialization algorithms, variable significance and variable selection algorithms, model selection methods and finally methods to construct confidence and prediction intervals. In addition, the complexity of each algorithm is discussed. Our proposed framework was tested in two simulated cases, in one chaotic time series described by the Mackey-Glass equation and in three real datasets described by daily temperatures in Berlin, daily wind speeds in New York and breast cancer classification. Our results have shown that the proposed algorithms produce stable and robust results, indicating that our proposed framework can be applied in various applications. Copyright © 2013 Elsevier Ltd. All rights reserved.
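
    A minimal one-dimensional wavelet network can be sketched with fixed translations and dilations and least-squares output weights; this is only a toy version, since the framework above also covers initialization, structure selection, and gradient training of the wavelon parameters themselves.

```python
import numpy as np

def mexican_hat(u):
    """Mother wavelet psi(u) = (1 - u^2) * exp(-u^2 / 2)."""
    return (1 - u**2) * np.exp(-u**2 / 2)

def fit_wavelet_net(x, y, n_wavelons=8):
    """1-D wavelet network with fixed translations/dilations:
        yhat(x) = w0 + sum_i w_i * psi((x - t_i) / s)
    Only the linear output weights are fit (least squares)."""
    t = np.linspace(x.min(), x.max(), n_wavelons)  # wavelon translations
    s = (x.max() - x.min()) / n_wavelons           # common dilation
    design = lambda xs: np.column_stack(
        [np.ones_like(xs)] + [mexican_hat((xs - ti) / s) for ti in t])
    w, *_ = np.linalg.lstsq(design(x), y, rcond=None)
    return lambda xq: design(xq) @ w
```

    Even this stripped-down version approximates a smooth target well, which is why the full framework concentrates on the harder questions of wavelon selection and interval construction.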

  3. Multilevel regularized regression for simultaneous taxa selection and network construction with metagenomic count data.

    PubMed

    Liu, Zhenqiu; Sun, Fengzhu; Braun, Jonathan; McGovern, Dermot P B; Piantadosi, Steven

    2015-04-01

    Identifying disease associated taxa and constructing networks for bacteria interactions are two important tasks usually studied separately. In reality, differentiation of disease associated taxa and correlation among taxa may affect each other. One genus can be differentiated because it is highly correlated with another highly differentiated one. In addition, network structures may vary under different clinical conditions. Permutation tests are commonly used to detect differences between networks in distinct phenotypes, and they are time-consuming. In this manuscript, we propose a multilevel regularized regression method to simultaneously identify taxa and construct networks. We also extend the framework to allow construction of a common network and a differentiated network together. An efficient algorithm with dual formulation is developed to deal with the large-scale n ≪ m problem with a large number of taxa (m) and a small number of samples (n). The proposed method is regularized with a general Lp (p ∈ [0, 2]) penalty and models the effects of taxa abundance differentiation and correlation jointly. We demonstrate that it can identify both true and biologically significant genera and network structures. Software MLRR in MATLAB is available at http://biostatistics.csmc.edu/mlrr/. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
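
    As a toy stand-in for the Lp-penalized model (here p = 1 only; MLRR's general Lp penalty and multilevel structure are not reproduced), an ISTA solver shows how regularization zeroes out non-informative taxa:

```python
import numpy as np

def lasso_ista(X, y, lam=0.1, n_iter=500):
    """L1-penalized (p = 1) regression via ISTA:
        min_w ||y - Xw||^2 / (2n) + lam * ||w||_1
    Coefficients driven exactly to zero correspond to taxa dropped
    from the model."""
    n, m = X.shape
    w = np.zeros(m)
    step = 1.0 / (np.linalg.norm(X, 2) ** 2 / n)  # 1 / Lipschitz constant
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y) / n              # smooth-part gradient
        w = w - step * grad
        w = np.sign(w) * np.maximum(np.abs(w) - step * lam, 0)  # soft-threshold
    return w
```

    On simulated abundance data with three truly associated taxa out of twenty, the penalty keeps the true signals and shrinks most of the rest to zero.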

  4. Construction of regulatory networks using expression time-series data of a genotyped population.

    PubMed

    Yeung, Ka Yee; Dombek, Kenneth M; Lo, Kenneth; Mittler, John E; Zhu, Jun; Schadt, Eric E; Bumgarner, Roger E; Raftery, Adrian E

    2011-11-29

    The inference of regulatory and biochemical networks from large-scale genomics data is a basic problem in molecular biology. The goal is to generate testable hypotheses of gene-to-gene influences and subsequently to design bench experiments to confirm these network predictions. Coexpression of genes in large-scale gene-expression data implies coregulation and potential gene-gene interactions, but provides little information about the direction of influences. Here, we use both time-series data and genetics data to infer directionality of edges in regulatory networks: time-series data contain information about the chronological order of regulatory events and genetics data allow us to map DNA variations to variations at the RNA level. We generate microarray data measuring time-dependent gene-expression levels in 95 genotyped yeast segregants subjected to a drug perturbation. We develop a Bayesian model averaging regression algorithm that incorporates external information from diverse data types to infer regulatory networks from the time-series and genetics data. Our algorithm is capable of generating feedback loops. We show that our inferred network recovers existing and novel regulatory relationships. Following network construction, we generate independent microarray data on selected deletion mutants to prospectively test network predictions. We demonstrate the potential of our network to discover de novo transcription-factor binding sites. Applying our construction method to previously published data demonstrates that our method is competitive with leading network construction algorithms in the literature.

  5. Simulation of Constructed Wetland in treating Wastewater using Fuzzy Logic Technique

    NASA Astrophysics Data System (ADS)

    Sudarsan, J. S.; Subramani, Sheekha; Rajan, Rajitha J.; Shah, Isha; Nithiyanantham, S.

    2018-04-01

    Constructed wetlands (CWs) act as a natural alternative to conventional methods of wastewater treatment. CWs are found effective for wastewater containing inorganic matter, organic matter, toxic compounds, nitrogen, phosphorus, heavy metals, organic chemicals, and pathogens. The treatment efficiency of CWs is achieved by a complex interaction between plants, microorganisms, the soil matrix and substances in the wastewater. Constructed wetland treatment systems are engineered systems designed to take advantage of the processes occurring in natural wetlands, but in a more controlled environment. Petrochemical wastewater was used for this study. The characteristics predominant in petrochemical wastewater, mainly oil, Biological Oxygen Demand (BOD) and Chemical Oxygen Demand (COD), were selected for treatment in the constructed wetland. The conventional treatment methods are chemical and biological. In this study, a fuzzy model for water quality assessment was developed and a water quality index value was obtained. The experiment and the subsequent fuzzy-logic analysis indicated that interpretation of imprecise data can be improved within a fuzzy inference system (FIS). Based on the analysis, the wetland cell containing Typha sp. showed greater efficiency in removing parameters such as COD and BOD than the Phragmites sp. cell.
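
    A toy Mamdani-style FIS (membership ranges and rules invented for illustration; the study's calibrated model differs) maps BOD and COD readings to a water quality index by rule clipping and centroid defuzzification:

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function peaking at b (a < b < c)."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def water_quality_index(bod, cod):
    """Toy Mamdani fuzzy inference: BOD and COD (mg/L, hypothetical
    ranges) -> quality index in [0, 100]."""
    z = np.linspace(0.0, 100.0, 501)  # output universe
    # rule strengths
    good = min(tri(bod, -1, 0, 40), tri(cod, -1, 0, 100))        # both low
    fair = max(tri(bod, 20, 60, 100), tri(cod, 60, 150, 240))    # either medium
    poor = max(tri(bod, 80, 150, 151), tri(cod, 200, 350, 351))  # either high
    # clip each output set by its rule strength, aggregate with max
    mu = np.maximum.reduce([
        np.minimum(tri(z, 55, 100, 101), good),
        np.minimum(tri(z, 30, 50, 70), fair),
        np.minimum(tri(z, -1, 0, 45), poor),
    ])
    return float((z * mu).sum() / mu.sum())  # centroid defuzzification
```

    Gradual membership overlap is what lets the FIS interpret imprecise measurements: a sample need not be crisply "clean" or "polluted" to receive a sensible index.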

  6. Role to Be Played by Independent Geotechnical Supervision in the Foundation for Bridge Construction

    NASA Astrophysics Data System (ADS)

    Sobala, Dariusz; Rybak, Jarosław

    2017-10-01

    Some remarks concerning the necessity of employing independent and ethical geotechnical supervision are presented in the paper. Starting from the design phase, through the whole construction process, the importance of the geotechnical engineer is stated in legal acts. Numerous testing technologies serve for the calibration of geotechnical technologies and allow the quality and capacity of piles to be confirmed. Special emphasis was paid to the involvement of scientific and research institutions, which can not only render services but also postprocess and systematize collected data; such databases enable new codes, methods and recommendations. Selection of deep foundations for bridge-type structures is most often dependent on complex geotechnical conditions, concentrated loads and constraints on pier displacements. The last of these, prior to more common introduction of the design-construct system, could be a convenient justification for a design engineer who imposed a deep foundation because he did not want, or was not able, to estimate the effect of pier settlement on the structure. The paper provides some notes about the need to engage a geotechnical supervising service of high competency and ethical quality during the engineering and construction stages of foundations for bridge-type structures, where legal requirements are of special consideration. Successive stages of projects are reviewed and research methods used for current calibration of geotechnical technologies and verification of geotechnical work quality are analysed. Special attention is given to the potential involvement of independent R&D institutions which, apart from rendering specific services, also collect and systemize the research results, thus enabling, in the long term, revision of engineering standards, instructions and guidelines.

  7. Multivariate Hermite interpolation on scattered point sets using tensor-product expo-rational B-splines

    NASA Astrophysics Data System (ADS)

    Dechevsky, Lubomir T.; Bang, Børre; Lakså, Arne; Zanaty, Peter

    2011-12-01

    At the Seventh International Conference on Mathematical Methods for Curves and Surfaces, Tønsberg, Norway, in 2008, several new constructions for Hermite interpolation on scattered point sets in domains in R^n, n ∈ N, combined with smooth convex partition of unity for several general types of partitions of these domains were proposed in [1]. All of these constructions were based on a new type of B-splines, proposed by some of the authors several years earlier: expo-rational B-splines (ERBS) [3]. In the present communication we shall provide more details about one of these constructions: the one for the most general class of domain partitions considered. This construction is based on the use of two separate families of basis functions: one which has all the necessary Hermite interpolation properties, and another which has the necessary properties of a smooth convex partition of unity. The constructions of both of these two bases are well-known; the new part of the construction is the combined use of these bases for the derivation of a new basis which enjoys having all above-said interpolation and unity partition properties simultaneously. In [1] the emphasis was put on the use of radial basis functions in the definitions of the two initial bases in the construction; now we shall put the main emphasis on the case when these bases consist of tensor-product B-splines. This selection provides two useful advantages: (A) it is easier to compute higher-order derivatives while working in Cartesian coordinates; (B) it becomes clear that this construction becomes a far-going extension of tensor-product constructions. We shall provide 3-dimensional visualization of the resulting bivariate bases, using tensor-product ERBS. In the main tensor-product variant, we shall consider also replacement of ERBS with simpler generalized ERBS (GERBS) [2], namely, their simplified polynomial modifications: the Euler Beta-function B-splines (BFBS). 
One advantage of using BFBS instead of ERBS is the simplified computation, since BFBS are piecewise polynomial, which ERBS are not. One disadvantage of using BFBS in the place of ERBS in this construction is that the necessary selection of the degree of BFBS imposes constraints on the maximal possible multiplicity of the Hermite interpolation.

  8. Creating and validating an instrument to identify the workload at an oncology and hematology outpatient service

    PubMed Central

    Martin, Lelia Gonçalves Rocha; Gaidzinski, Raquel Rapone

    2014-01-01

    Objective To construct and validate an instrument for measuring the time spent by nursing staff on interventions/activities in an Outpatient Oncology and Hematology service, with interventions based on the Nursing Interventions Classification (NIC) for the key areas of Pediatric Oncology and Oncology Nursing. Methods Cross-sectional study divided into two steps: (1) construction of an instrument to measure the nursing interventions/activities and (2) validation of this instrument. Results We selected 32 essential interventions from the NIC for the Pediatric Oncology and Oncology Nursing areas. The judges agreed with removing 13 interventions and including 6 interventions in the instrument, in addition to personal activity. Conclusion The choice of essential interventions from the NIC is justified by the time gained in the research. PMID:25295454

  9. Research on Evaluation of resource allocation efficiency of transportation system based on DEA

    NASA Astrophysics Data System (ADS)

    Zhang, Zhehui; Du, Linan

    2017-06-01

    In this paper, we select time-series data for 1985-2015 and construct land (shoreline) resources, capital and labor as inputs; the outputs are freight volume and passenger volume. Using quantitative analysis based on the DEA method, we evaluate the resource allocation efficiency of railway, highway, water transport and civil aviation in China. Research shows that the resource allocation efficiency of the various modes of transport differs markedly, and the impact on scale efficiency is more significant. The two most important ways to optimize the allocation of resources and improve efficiency are promoting the coordination of the various modes of transport and constructing an integrated transportation system.
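
    An input-oriented CCR model, the standard DEA variant (whether the study used CCR or BCC is not stated here), can be solved per decision-making unit as a small linear program; the data in the test are hypothetical.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(X, Y):
    """Input-oriented CCR DEA efficiencies.
    X: (n_dmu, n_inputs), Y: (n_dmu, n_outputs).
    For each DMU0 solve:
        min theta  s.t.  sum_j lam_j * x_j <= theta * x_0,
                         sum_j lam_j * y_j >= y_0,  lam >= 0.
    Efficiency theta lies in (0, 1]; theta = 1 means frontier-efficient."""
    n, mi = X.shape
    mo = Y.shape[1]
    eff = np.zeros(n)
    for k in range(n):
        c = np.zeros(1 + n); c[0] = 1.0   # decision vars: [theta, lam_1..lam_n]
        A_ub = np.zeros((mi + mo, 1 + n))
        b_ub = np.zeros(mi + mo)
        A_ub[:mi, 0] = -X[k]              # -theta*x_0 + sum lam_j x_j <= 0
        A_ub[:mi, 1:] = X.T
        A_ub[mi:, 1:] = -Y.T              # -sum lam_j y_j <= -y_0
        b_ub[mi:] = -Y[k]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                      bounds=[(None, None)] + [(0, None)] * n)
        eff[k] = res.x[0]
    return eff
```

    A DMU using twice the input of a peer for the same output scores 0.5, matching the intuition that half its resources are wasted relative to the frontier.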

  10. From Never Born Proteins to Minimal Living Cells: two projects in synthetic biology.

    PubMed

    Luisi, Pier Luigi; Chiarabelli, Cristiano; Stano, Pasquale

    2006-12-01

    The Never Born Proteins (NBPs) and the Minimal Cell projects are two currently developed research lines belonging to the field of synthetic biology. The first deals with the investigation of structural and functional properties of de novo proteins with random sequences, selected and isolated using phage display methods. The minimal cell is the simplest cellular construct which displays living properties, such as self-maintenance, self-reproduction and evolvability. The semi-synthetic approach to minimal cells involves the use of extant genes and proteins in order to build a supramolecular construct based on lipid vesicles. Results and outlooks on these two research lines are briefly discussed, mainly focusing on their relevance to origin-of-life studies.

  11. Encapsulation of a Decision-Making Model to Optimize Supplier Selection via Structural Equation Modeling (SEM)

    NASA Astrophysics Data System (ADS)

    Sahul Hameed, Ruzanna; Thiruchelvam, Sivadass; Nasharuddin Mustapha, Kamal; Che Muda, Zakaria; Mat Husin, Norhayati; Ezanee Rusli, Mohd; Yong, Lee Choon; Ghazali, Azrul; Itam, Zarina; Hakimie, Hazlinda; Beddu, Salmia; Liyana Mohd Kamal, Nur

    2016-03-01

    This paper proposes a conceptual framework to compare the criteria/factors that influence supplier selection. A mixed-methods approach comprising qualitative and quantitative surveys will be used. The study intends to identify and define the metrics that key stakeholders at the Public Works Department (PWD) believe should be used for supplier selection. The outcomes should indicate initiatives that could bring procurement in PWD to a strategic level. The results will provide a deeper understanding of the drivers of supplier selection in the construction industry. The output will benefit the many parties involved in supplier selection decision-making. The findings provide useful information and a greater understanding of the perceptions that PWD executives hold regarding supplier selection and the extent to which these perceptions are consistent with findings from prior studies. The findings can be utilized as input for policy makers to outline changes in the current procurement code of practice in order to enhance the degree of transparency and integrity in decision-making.

  12. Optimization of the scheme for natural ecology planning of urban rivers based on ANP (analytic network process) model.

    PubMed

    Zhang, Yichuan; Wang, Jiangping

    2015-07-01

    Rivers serve as a highly valued component of ecosystems and urban infrastructure. River planning should follow the basic principles of maintaining or reconstructing the natural landscape and ecological functions of rivers. Optimization of the planning scheme is a prerequisite for successful construction of urban rivers; therefore, relevant studies on optimizing schemes for natural ecology planning of rivers are crucial. In the present study, four planning schemes for the Zhaodingpal River in Xinxiang City, Henan Province were included as the objects for optimization. Fourteen factors that influence the natural ecology planning of urban rivers were selected from five aspects so as to establish the ANP model. The data processing was done using the Super Decisions software. The results showed that the importance degree of scheme 3 was the highest. A scientific, reasonable and accurate evaluation of schemes could be made by the ANP method for natural ecology planning of urban rivers. This method could be used to provide references for sustainable development and construction of urban rivers. The ANP method is also suitable for optimizing schemes for urban green space planning and design.

  13. Development of an Improved Mammalian Overexpression Method for Human CD62L

    PubMed Central

    Brown, Haley A.; Roth, Gwynne; Holzapfel, Genevieve; Shen, Sarek; Rahbari, Kate; Ireland, Joanna; Zou, Zhongcheng; Sun, Peter D.

    2014-01-01

    We have previously developed a glutamine synthetase (GS)-based mammalian recombinant protein expression system that is capable of producing 5 to 30 mg/L of recombinant proteins. The overexpression is based on multiple rounds of target gene amplification driven by methionine sulfoximine (MSX), an inhibitor of glutamine synthetase. However, like other stable mammalian overexpression systems, a major shortcoming of the GS-based expression system is its lengthy turn-around time, typically taking 4–6 months. To shorten the construction time, we replaced the multi-round target gene amplifications with a single round of in situ amplification, thereby shortening the cell line construction to 2 months. The single-round in situ amplification method yielded the highest-expressing recombinant CHO cell lines, producing ~5 mg/L of soluble CD62L, similar to those derived from the multi-round amplification and selection method. In addition, we developed an MSX resistance assay as an alternative to ELISA for evaluating the expression level of stable recombinant CHO cell lines. PMID:25286402

  14. Extraction of features from ultrasound acoustic emissions: a tool to assess the hydraulic vulnerability of Norway spruce trunkwood?

    PubMed Central

    Rosner, Sabine; Klein, Andrea; Wimmer, Rupert; Karlsson, Bo

    2011-01-01

    Summary • The aim of this study was to assess the hydraulic vulnerability of Norway spruce (Picea abies) trunkwood by extracting selected features of the acoustic emissions (AEs) detected during dehydration of standard-size samples. • The hydraulic method was used as the reference method to assess the hydraulic vulnerability of trunkwood of different cambial ages. Vulnerability curves were constructed by plotting the percentage loss of conductivity against an overpressure of compressed air. • Differences in hydraulic vulnerability were very pronounced between juvenile and mature wood samples; therefore, useful AE features, such as peak amplitude, duration and relative energy, could be filtered out. The AE rates of signals clustered by amplitude and duration ranges, and the AE energies, differed greatly between juvenile and mature wood at identical relative water losses. • Vulnerability curves could be constructed by relating the cumulated amount of relative AE energy to the relative loss of water and to xylem tension. AE testing in combination with feature extraction offers a readily automated, easy-to-use alternative to the hydraulic method. PMID:16771986

  15. Mine Planning for Asteroid Orebodies

    NASA Astrophysics Data System (ADS)

    Gertsch, L. S.; Gertsch, R. E.

    2000-01-01

    Given that an asteroid (or comet) has been determined to contain sufficient material of value to be potentially economic to exploit, a mining method must be selected and implemented. This paper discusses the engineering necessary to bring a mine online, and the opportunities and challenges inherent in asteroid mineral prospects. The very important step of orebody characterization is discussed elsewhere. The mining methods discussed here are based on enclosing the asteroid within a bag in some fashion, whether completely or partially. In general, asteroid mining methods based on bags will consist of the following steps. Not all will be required in every case, nor necessarily in this particular sequence. Some steps will be performed simultaneously. Their purpose is to extract the valuable material from the body of the asteroid in the most efficient, cost-effective manner possible. In approximate order of initiation, if not of conclusion, the steps are: 1. Tether anchoring to the asteroid. 2. Asteroid motion control. 3. Body/fragment restraint system placement. 4. Operations platform construction. 5. Bag construction. 6. Auxiliary and support equipment placement. 7. Mining operations. 8. Processing operations. 9. Product transport to markets.

  16. High-Throughput Method for Ranking the Affinity of Peptide Ligands Selected from Phage Display Libraries

    PubMed Central

    González-Techera, A.; Umpiérrez-Failache, M.; Cardozo, S.; Obal, G.; Pritsch, O.; Last, J. A.; Gee, S. J.; Hammock, B. D.; González-Sapienza, G.

    2010-01-01

    The use of phage display peptide libraries allows rapid isolation of peptide ligands for any target selector molecule. However, due to differences in peptide expression and the heterogeneity of the phage preparations, there is no easy way to compare the binding properties of the selected clones, which operates as a major “bottleneck” of the technology. Here, we present the development of a new type of library that allows rapid comparison of the relative affinity of the selected peptides in a high-throughput screening format. As a model system, a phage display peptide library constructed on a phagemid vector that contains the bacterial alkaline phosphatase gene (BAP) was selected with an antiherbicide antibody. Due to the intrinsic switching capacity of the library, the selected peptides were transferred “en masse” from the phage coat protein to BAP. This was coupled to an optimized affinity ELISA in which normalized amounts of the peptide–BAP fusion allow direct comparison of the binding properties of hundreds of peptide ligands. The system was validated by surface plasmon resonance experiments using synthetic peptides, showing that the method discriminates among peptide affinities spanning 3 orders of magnitude. In addition, the peptide–BAP protein can find direct application as a tracer reagent. PMID:18393454

  17. Primer Extension Mutagenesis Powered by Selective Rolling Circle Amplification

    PubMed Central

    Huovinen, Tuomas; Brockmann, Eeva-Christine; Akter, Sultana; Perez-Gamarra, Susan; Ylä-Pelto, Jani; Liu, Yuan; Lamminmäki, Urpo

    2012-01-01

    Primer extension mutagenesis is a popular tool to create libraries for in vitro evolution experiments. Here we describe a further improvement of the method described by T.A. Kunkel, which uses uracil-containing single-stranded DNA as the template for primer extension, by adding uracil-DNA glycosylase treatment and rolling circle amplification (RCA) steps. It is shown that removal of uracil bases from the template leads to selective amplification of the nascently synthesized circular DNA strand carrying the desired mutations by phi29 DNA polymerase. Selective RCA (sRCA) of the DNA heteroduplex formed in Kunkel's mutagenesis increases the mutagenesis efficiency from 50% to close to 100% and increases the number of transformants 300-fold without notable diversity bias. We also observed that both the mutated and the wild-type DNA were present in at least one third of the cells transformed directly with Kunkel's heteroduplex; in contrast, the cells transformed with the sRCA product contained only mutated DNA. In sRCA, the complex cell-based selection for the mutant strand is replaced with a more controllable enzyme-based selection, and less DNA is needed for library creation. Construction of a gene library of ten billion members from 240 nanograms of starting DNA is demonstrated with the described method. PMID:22355397

  18. Absolute order-of-magnitude reasoning applied to a social multi-criteria evaluation framework

    NASA Astrophysics Data System (ADS)

    Afsordegan, A.; Sánchez, M.; Agell, N.; Aguado, J. C.; Gamboa, G.

    2016-03-01

    A social multi-criteria evaluation framework is applied to a real-world problem: selecting a wind farm location in the regions of Urgell and Conca de Barberá in Catalonia (northeast Spain). This paper applies a qualitative multi-criteria decision analysis approach based on linguistic-label assessment that can address uncertainty and deal with different levels of precision. The method draws on qualitative reasoning, an artificial intelligence technique, to assess and rank multi-attribute alternatives expressed as linguistic labels. It is suitable for problems in a social setting, such as energy planning, that require constructing a dialogue process among many social actors under high complexity and uncertainty. The method is compared with an existing outranking approach, based on Condorcet's original method, that has previously been applied to the wind farm location problem. The results obtained by both approaches are analysed and their aggregation procedures and performance in selecting the wind farm location are compared. Although both methods lead to similar rankings of alternatives, the study highlights the advantages and drawbacks of each.

  19. Data selection techniques in the interpretation of MAGSAT data over Australia

    NASA Technical Reports Server (NTRS)

    Johnson, B. D.; Dampney, C. N. G.

    1983-01-01

    The MAGSAT data require critical selection in order to produce a self-consistent data set suitable for map construction and subsequent interpretation. Interactive data selection techniques are described that involve a special-purpose profile-oriented data base and a colour graphics display. Careful application of these data selection techniques permits validation of every data value and ensures that the best possible self-consistent data set is used to construct the maps of the magnetic field measured at satellite altitudes over Australia.

  20. The use of lactic acid-producing, malic acid-producing, or malic acid-degrading yeast strains for acidity adjustment in the wine industry.

    PubMed

    Su, Jing; Wang, Tao; Wang, Yun; Li, Ying-Ying; Li, Hua

    2014-03-01

    In an era of economic globalization, competition among wine businesses is likely to get tougher, and biotechnological innovation intensifies that competition throughout the wine industry. Moreover, modern consumers prefer individualized, tailored, healthy, top-quality wine products. Together, these trends open a large gap between wine production and wine consumption. Market-oriented yeast strains are therefore being selected or developed to enhance the core competitiveness of wine enterprises. Reasonable biological acidity is critical to a high-quality wine. Many wild-type acidity-adjustment yeast strains have been selected around the world, and mutation breeding, metabolic engineering, genetic engineering, and protoplast fusion methods are being used to construct new acidity-adjustment yeast strains that meet market demands. In this paper, strategies and concepts for strain selection and improvement are discussed, and selected studies involving acidity-adjustment yeast strains are reviewed. The development of acidity-adjustment yeast strains with minimized resource inputs and improved fermentation and enological capabilities, for the environmentally friendly production of healthy, top-quality wine, is also presented.

  1. Effective formation method for an aspherical microlens array based on an aperiodic moving mask during exposure.

    PubMed

    Shi, Lifang; Du, Chunlei; Dong, Xiaochun; Deng, Qiling; Luo, Xiangang

    2007-12-01

    An aperiodic mask design method for fabricating a microlens array with an aspherical profile is proposed. The nonlinear relationship between exposure dose and lens profile is considered, and selection criteria for the quantization interval and the fabrication range of the method are given. A mask function for a quadrangle microlens array with a hyperboloid profile, intended for infrared use, was constructed using this method, and the microlens array can be fabricated effectively in a single exposure with the mask. Reactive ion etching was carried out to transfer the structure into a germanium substrate. The measurement results indicate that the roughness is less than 10 nm (PV) and the profile error is less than 40 nm (RMS).

  2. An adaptive incremental approach to constructing ensemble classifiers: application in an information-theoretic computer-aided decision system for detection of masses in mammograms.

    PubMed

    Mazurowski, Maciej A; Zurada, Jacek M; Tourassi, Georgia D

    2009-07-01

    Ensemble classifiers have been shown to be effective in many applications. In this article, the authors explore the effectiveness of ensemble classifiers in a case-based computer-aided diagnosis system for the detection of masses in mammograms. They evaluate two general ways of constructing subclassifiers by resampling the available development dataset: random division and random selection. Furthermore, they discuss the problem of selecting the ensemble size and propose two adaptive incremental techniques that automatically select a suitable size for the problem at hand. All techniques are evaluated with respect to a previously proposed information-theoretic CAD system (IT-CAD). The experimental results show that the examined ensemble techniques provide a statistically significant improvement in performance (AUC = 0.905 +/- 0.024) over the original IT-CAD system (AUC = 0.865 +/- 0.029). Some of the techniques allow a notable reduction in the total number of examples stored in the case base (to 1.3% of the original size), which in turn lowers storage requirements and shortens the system's response time. Among the methods examined, the two proposed adaptive techniques are by far the most effective for this purpose. The authors also provide guidance for choosing the ensemble parameters.
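
    The "random selection" idea of building subclassifiers from resampled subsets of a development dataset can be sketched as follows; the synthetic data, the nearest-class-mean base classifier, and all parameters are illustrative assumptions, not the IT-CAD system itself:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class dataset standing in for the mammography case base (assumption).
X = np.vstack([rng.normal(-1.0, 0.7, size=(60, 2)),
               rng.normal(+1.0, 0.7, size=(60, 2))])
y = np.array([0] * 60 + [1] * 60)

def train_subclassifier(Xs, ys):
    """Nearest-class-mean classifier fitted on a random subset of the case base."""
    return np.array([Xs[ys == c].mean(axis=0) for c in (0, 1)])

def predict(means, X):
    # Assign each sample to the class with the nearest mean.
    d = np.linalg.norm(X[:, None, :] - means[None, :, :], axis=2)
    return d.argmin(axis=1)

# "Random selection": each subclassifier sees a small random sample of cases.
n_members, subset = 15, 30
members = [train_subclassifier(X[idx], y[idx])
           for idx in (rng.choice(len(X), size=subset, replace=False)
                       for _ in range(n_members))]

# Majority vote over the ensemble members.
votes = np.stack([predict(m, X) for m in members])
y_pred = (votes.mean(axis=0) > 0.5).astype(int)
accuracy = (y_pred == y).mean()
print(f"ensemble training accuracy: {accuracy:.2f}")
```

    The adaptive size-selection techniques in the paper would grow `n_members` incrementally and stop when performance plateaus; the voting machinery is the same.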

  3. Performance Evaluation and Online Realization of Data-driven Normalization Methods Used in LC/MS based Untargeted Metabolomics Analysis.

    PubMed

    Li, Bo; Tang, Jing; Yang, Qingxia; Cui, Xuejiao; Li, Shuang; Chen, Sijie; Cao, Quanxing; Xue, Weiwei; Chen, Na; Zhu, Feng

    2016-12-13

    In untargeted metabolomics analysis, several factors (e.g., unwanted experimental and biological variations and technical errors) may hamper the identification of differential metabolic features, which makes data-driven normalization necessary before feature selection. So far, ≥16 normalization methods have been widely applied for processing LC/MS-based metabolomics data. However, the performance and sample-size dependence of these methods have not yet been exhaustively compared, and no online tool for comparatively and comprehensively evaluating all 16 normalization methods has been provided. In this study, a comprehensive comparison of these methods was conducted. The 16 methods were categorized into three groups based on their normalization performance across various sample sizes. VSN, Log Transformation, and PQN were identified as the best-performing normalization methods, while Contrast consistently underperformed across all sub-datasets of the benchmark data. Moreover, an interactive web tool comprehensively evaluating the performance of the 16 methods on LC/MS-based metabolomics data was constructed and hosted at http://server.idrb.cqu.edu.cn/MetaPre/. In summary, this study can serve as useful guidance for selecting suitable normalization methods when analyzing LC/MS-based metabolomics data.
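
    As an illustration of one of the best-performing methods named above, a minimal sketch of probabilistic quotient normalization (PQN) on a toy intensity matrix (the numbers are invented; real pipelines operate on full LC/MS feature tables):

```python
import numpy as np

def pqn_normalize(X):
    """Probabilistic quotient normalization for a samples-x-features matrix.

    Each sample is divided by the median quotient between its feature
    intensities and a reference spectrum (the median spectrum across samples),
    correcting for sample-wide dilution effects.
    """
    X = np.asarray(X, dtype=float)
    reference = np.median(X, axis=0)
    quotients = X / reference              # per-feature fold change vs reference
    dilution = np.median(quotients, axis=1)
    return X / dilution[:, None]

# Toy example: sample 2 is a 2x "diluted" copy of sample 1; PQN recovers it.
X = np.array([
    [100.0, 200.0, 50.0, 400.0],
    [ 50.0, 100.0, 25.0, 200.0],
])
Xn = pqn_normalize(X)
print(Xn)  # both rows become identical after normalization
```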

  4. Performance Evaluation and Online Realization of Data-driven Normalization Methods Used in LC/MS based Untargeted Metabolomics Analysis

    PubMed Central

    Li, Bo; Tang, Jing; Yang, Qingxia; Cui, Xuejiao; Li, Shuang; Chen, Sijie; Cao, Quanxing; Xue, Weiwei; Chen, Na; Zhu, Feng

    2016-01-01

    In untargeted metabolomics analysis, several factors (e.g., unwanted experimental and biological variations and technical errors) may hamper the identification of differential metabolic features, which makes data-driven normalization necessary before feature selection. So far, ≥16 normalization methods have been widely applied for processing LC/MS-based metabolomics data. However, the performance and sample-size dependence of these methods have not yet been exhaustively compared, and no online tool for comparatively and comprehensively evaluating all 16 normalization methods has been provided. In this study, a comprehensive comparison of these methods was conducted. The 16 methods were categorized into three groups based on their normalization performance across various sample sizes. VSN, Log Transformation, and PQN were identified as the best-performing normalization methods, while Contrast consistently underperformed across all sub-datasets of the benchmark data. Moreover, an interactive web tool comprehensively evaluating the performance of the 16 methods on LC/MS-based metabolomics data was constructed and hosted at http://server.idrb.cqu.edu.cn/MetaPre/. In summary, this study can serve as useful guidance for selecting suitable normalization methods when analyzing LC/MS-based metabolomics data. PMID:27958387

  5. A Novel Characteristic Frequency Bands Extraction Method for Automatic Bearing Fault Diagnosis Based on Hilbert Huang Transform

    PubMed Central

    Yu, Xiao; Ding, Enjie; Chen, Chunxu; Liu, Xiaoming; Li, Li

    2015-01-01

    Because roller element bearing (REB) failures cause unexpected machinery breakdowns, their fault diagnosis has attracted considerable research attention. Established fault feature extraction methods focus on statistical characteristics of the vibration signal, an approach that loses sight of continuous waveform features. To address this weakness, this article proposes a novel frequency-band feature extraction method, named Window Marginal Spectrum Clustering (WMSC), which selects salient features from the marginal spectrum of vibration signals obtained by the Hilbert–Huang Transform (HHT). In WMSC, a sliding window divides the entire HHT marginal spectrum (HMS) into window spectrums, and the Rand Index (RI) criterion of the clustering method is then used to evaluate each window. Windows returning higher RI values are selected to construct characteristic frequency bands (CFBs). Next, a hybrid REB fault diagnosis model is constructed, termed by its elements HHT-WMSC-SVM (support vector machine). The effectiveness of HHT-WMSC-SVM is validated by a series of experiments on REB defect datasets from the Bearing Data Center of Case Western Reserve University (CWRU). The test results demonstrate three major advantages of the novel method. First, the fault classification accuracy of the HHT-WMSC-SVM model is higher than that of HHT-SVM and of ST-SVM, a method that combines statistical characteristics with SVM. Second, with Gaussian white noise added to the original REB defect dataset, the HHT-WMSC-SVM model maintains high classification accuracy, while the accuracy of the ST-SVM and HHT-SVM models drops significantly. Third, the fault classification accuracy of HHT-WMSC-SVM exceeds 95% for a Pmin range of 500–800 and an m range of 50–300 on the REB defect dataset with Gaussian white noise added at a signal-to-noise ratio (SNR) of 5. The experimental results indicate that the proposed WMSC method yields high REB fault classification accuracy and performs well under Gaussian white noise. PMID:26540059
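
    The window-selection idea can be sketched with synthetic spectra; here a simple between-class mean separation stands in for the paper's Rand Index criterion, and all data and parameters are assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic marginal spectra (assumption): two fault classes whose energy
# differs only in bins 40-60, mimicking a fault-sensitive frequency band.
n, bins = 40, 100
spectra = rng.normal(1.0, 0.1, size=(n, bins))
labels = np.array([0] * 20 + [1] * 20)
spectra[labels == 1, 40:60] += 0.8

def window_scores(S, y, width=10, step=5):
    """Score each sliding window of the spectrum by between-class separation
    (a simple stand-in for the paper's Rand Index criterion)."""
    starts = list(range(0, S.shape[1] - width + 1, step))
    scores = []
    for s in starts:
        w = S[:, s:s + width].mean(axis=1)   # mean window energy per sample
        scores.append(abs(w[y == 0].mean() - w[y == 1].mean()))
    return np.array(starts), np.array(scores)

starts, scores = window_scores(spectra, labels)
best = starts[scores.argmax()]
print(f"most discriminative band starts at bin {best}")
```

    High-scoring windows would then be concatenated into characteristic frequency bands and fed to an SVM, as the abstract describes.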

  6. A Novel Characteristic Frequency Bands Extraction Method for Automatic Bearing Fault Diagnosis Based on Hilbert Huang Transform.

    PubMed

    Yu, Xiao; Ding, Enjie; Chen, Chunxu; Liu, Xiaoming; Li, Li

    2015-11-03

    Because roller element bearing (REB) failures cause unexpected machinery breakdowns, their fault diagnosis has attracted considerable research attention. Established fault feature extraction methods focus on statistical characteristics of the vibration signal, an approach that loses sight of continuous waveform features. To address this weakness, this article proposes a novel frequency-band feature extraction method, named Window Marginal Spectrum Clustering (WMSC), which selects salient features from the marginal spectrum of vibration signals obtained by the Hilbert-Huang Transform (HHT). In WMSC, a sliding window divides the entire HHT marginal spectrum (HMS) into window spectrums, and the Rand Index (RI) criterion of the clustering method is then used to evaluate each window. Windows returning higher RI values are selected to construct characteristic frequency bands (CFBs). Next, a hybrid REB fault diagnosis model is constructed, termed by its elements HHT-WMSC-SVM (support vector machine). The effectiveness of HHT-WMSC-SVM is validated by a series of experiments on REB defect datasets from the Bearing Data Center of Case Western Reserve University (CWRU). The test results demonstrate three major advantages of the novel method. First, the fault classification accuracy of the HHT-WMSC-SVM model is higher than that of HHT-SVM and of ST-SVM, a method that combines statistical characteristics with SVM. Second, with Gaussian white noise added to the original REB defect dataset, the HHT-WMSC-SVM model maintains high classification accuracy, while the accuracy of the ST-SVM and HHT-SVM models drops significantly. Third, the fault classification accuracy of HHT-WMSC-SVM exceeds 95% for a Pmin range of 500-800 and an m range of 50-300 on the REB defect dataset with Gaussian white noise added at a signal-to-noise ratio (SNR) of 5. The experimental results indicate that the proposed WMSC method yields high REB fault classification accuracy and performs well under Gaussian white noise.

  7. Measuring organizational and individual factors thought to influence the success of quality improvement in primary care: a systematic review of instruments

    PubMed Central

    2012-01-01

    Background Continuous quality improvement (CQI) methods are widely used in healthcare; however, the effectiveness of the methods is variable, and evidence about the extent to which contextual and other factors modify effects is limited. Investigating the relationship between these factors and CQI outcomes poses challenges for those evaluating CQI, among the most complex of which relate to the measurement of modifying factors. We aimed to provide guidance to support the selection of measurement instruments by systematically collating, categorising, and reviewing quantitative self-report instruments. Methods Data sources: We searched MEDLINE, PsycINFO, and Health and Psychosocial Instruments, reference lists of systematic reviews, and citations and references of the main report of instruments. Study selection: The scope of the review was determined by a conceptual framework developed to capture factors relevant to evaluating CQI in primary care (the InQuIRe framework). Papers reporting development or use of an instrument measuring a construct encompassed by the framework were included. Data extracted included instrument purpose; theoretical basis, constructs measured and definitions; development methods and assessment of measurement properties. Analysis and synthesis: We used qualitative analysis of instrument content and our initial framework to develop a taxonomy for summarising and comparing instruments. Instrument content was categorised using the taxonomy, illustrating coverage of the InQuIRe framework. Methods of development and evidence of measurement properties were reviewed for instruments with potential for use in primary care. Results We identified 186 potentially relevant instruments, 152 of which were analysed to develop the taxonomy. Eighty-four instruments measured constructs relevant to primary care, with content measuring CQI implementation and use (19 instruments), organizational context (51 instruments), and individual factors (21 instruments). 
Forty-one instruments were included for full review. Development methods were often pragmatic, rather than systematic and theory-based, and evidence supporting measurement properties was limited. Conclusions Many instruments are available for evaluating CQI, but most require further use and testing to establish their measurement properties. Further development and use of these measures in evaluations should increase the contribution made by individual studies to our understanding of CQI and enhance our ability to synthesise evidence for informing policy and practice. PMID:23241168

  8. Uncertainty analysis of neural network based flood forecasting models: An ensemble based approach for constructing prediction interval

    NASA Astrophysics Data System (ADS)

    Kasiviswanathan, K.; Sudheer, K.

    2013-05-01

    Artificial neural network (ANN) based hydrologic models have gained a lot of attention among water resources engineers and scientists owing to their potential for more accurate prediction of flood flows than conceptual or physics-based hydrologic models. The ANN approximates the non-linear functional relationship between complex hydrologic variables to arrive at river flow forecasts. Despite a large number of applications, ANNs are still criticized because their point predictions lack reliability: the uncertainty of the predictions is not quantified, which limits their use in practice. A major concern in applying traditional uncertainty analysis techniques to the neural network framework is its parallel computing architecture with many degrees of freedom, which makes uncertainty assessment challenging, and very few studies have considered the predictive uncertainty of ANN-based hydrologic models. In this study, a novel method is proposed that constructs the prediction interval of an ANN flood forecasting model during calibration itself. Calibration proceeds in two stages: in stage 1, the ANN model is trained with a genetic algorithm (GA) to obtain the optimal set of weights and biases; in stage 2, the optimal variability of the ANN parameters obtained in stage 1 is identified so as to create an ensemble of predictions. The stage 2 optimization has multiple objectives: (i) minimum residual variance for the ensemble mean, (ii) maximum number of measured data points falling within the estimated prediction interval, and (iii) minimum width of the prediction interval. The method is illustrated with a real-world case study of an Indian basin, where it produced an ensemble with an average prediction interval width of 23.03 m3/s and 97.17% of the measured validation data points lying within the interval.
    The derived prediction interval for a selected hydrograph in the validation data set is presented in Fig. 1. Most of the observed flows lie within the constructed prediction interval, which therefore conveys the uncertainty of the prediction. One specific advantage of the method is that when the ensemble mean is taken as the forecast, peak flows are predicted more accurately than by traditional single-point-forecast ANNs. Fig. 1 Prediction interval for a selected hydrograph
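
    The two interval-quality measures reported above (coverage and average width) can be computed from any forecast ensemble; a minimal sketch with a synthetic ensemble (not the study's basin data):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical ensemble of ANN forecasts: 50 members x 100 time steps.
observed = rng.gamma(shape=2.0, scale=50.0, size=100)        # flows in m3/s
ensemble = observed + rng.normal(0.0, 12.0, size=(50, 100))  # member forecasts

# Prediction interval from the ensemble spread (2.5th-97.5th percentiles).
lower = np.percentile(ensemble, 2.5, axis=0)
upper = np.percentile(ensemble, 97.5, axis=0)

# Coverage: share of observations inside the interval; width: its average span.
coverage = np.mean((observed >= lower) & (observed <= upper)) * 100
avg_width = np.mean(upper - lower)
print(f"coverage: {coverage:.1f}% of observations inside the interval")
print(f"average interval width: {avg_width:.1f} m3/s")
```

    The study's second-stage optimization effectively trades these two quantities off: wider intervals raise coverage, so both are needed to judge an interval's quality.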

  9. COACH: profile-profile alignment of protein families using hidden Markov models.

    PubMed

    Edgar, Robert C; Sjölander, Kimmen

    2004-05-22

    Alignments of two multiple-sequence alignments, or of statistical models of such alignments (profiles), have important applications in computational biology. The increased information in a profile versus a single sequence can lead to more accurate alignments and more sensitive homolog detection in database searches. Several profile-profile alignment methods have been proposed and shown to improve sensitivity and alignment quality compared with sequence-sequence methods (such as BLAST) and profile-sequence methods (e.g., PSI-BLAST). Here we present a new approach to profile-profile alignment that we call Comparison of Alignments by Constructing Hidden Markov Models (COACH). COACH aligns two multiple sequence alignments by constructing a profile hidden Markov model (HMM) from one alignment and aligning the other to that HMM. We compare the alignment accuracy of COACH with two recently published methods: Yona and Levitt's prof_sim and Sadreyev and Grishin's COMPASS. On two sets of reference alignments selected from the FSSP database, we find that COACH is able, on average, to produce alignments giving the best coverage or the fewest errors, depending on the chosen parameter settings. COACH is freely available from www.drive5.com/lobster

  10. A novel ultra-performance liquid chromatography hyphenated with quadrupole time of flight mass spectrometry method for rapid estimation of total toxic retronecine-type of pyrrolizidine alkaloids in herbs without requiring corresponding standards.

    PubMed

    Zhu, Lin; Ruan, Jian-Qing; Li, Na; Fu, Peter P; Ye, Yang; Lin, Ge

    2016-03-01

    Nearly 50% of naturally occurring pyrrolizidine alkaloids (PAs) are hepatotoxic, and the majority of hepatotoxic PAs are retronecine-type PAs (RET-PAs). However, quantitative measurement of PAs in herbs and foodstuffs is often difficult because most reference PAs are unavailable. In this study, a rapid, selective, and sensitive UHPLC-QTOF-MS method was developed to estimate RET-PAs in herbs without requiring the corresponding standards. The method is based on our previously established characteristic and diagnostic mass fragmentation patterns, with retrorsine used for calibration. Using a single RET-PA (retrorsine) to construct the calibration was justified by the high similarity, with no significant differences, of the calibration curves obtained by plotting the peak areas of the extracted ion chromatograms of the fragment ions at m/z 120.0813 or 138.0919 against the concentrations of five representative RET-PAs. The developed method was successfully applied to measure the total content of structurally diverse toxic RET-PAs in fifteen potential PA-containing herbs.
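
    The single-standard calibration step amounts to fitting a line of fragment-ion peak area against retrorsine concentration and inverting it for unknowns; a sketch with invented numbers (the areas and concentrations below are not from the paper):

```python
import numpy as np

# Hypothetical calibration: peak areas of the m/z 120.0813 fragment ion
# measured for retrorsine standards at known concentrations (ng/mL).
conc = np.array([10.0, 50.0, 100.0, 500.0, 1000.0])
area = np.array([2.1e4, 1.05e5, 2.0e5, 1.01e6, 2.02e6])  # roughly linear

# Linear least-squares calibration curve: area = slope * conc + intercept.
slope, intercept = np.polyfit(conc, area, deg=1)

def estimate_conc(sample_area):
    """Estimate RET-PA concentration in an unknown extract from its peak area."""
    return (sample_area - intercept) / slope

est = estimate_conc(4.0e5)
print(f"slope = {slope:.1f}, intercept = {intercept:.1f}")
print(f"estimated concentration: {est:.0f} ng/mL")
```

    The paper's key observation is that the curves for five different RET-PAs nearly coincide, which is what licenses applying one retrorsine-derived line to structurally diverse analytes.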

  11. [Montessori method applied to dementia - literature review].

    PubMed

    Brandão, Daniela Filipa Soares; Martín, José Ignacio

    2012-06-01

    The Montessori method was initially developed for children but has since been applied to people with dementia. The purpose of this study is to systematically review the research on the effectiveness of this method, using the Medical Literature Analysis and Retrieval System Online (Medline) with the keywords dementia and Montessori method. We selected 10 studies, which reported significant improvements in participation and constructive engagement and reductions in negative affect and passive engagement. Nevertheless, systematic reviews of this non-pharmacological intervention in dementia rate the method as weak in terms of effectiveness. This apparent discrepancy may be explained either because the Montessori method in fact has only a small influence on dimensions such as behavioral problems, or because the method has not been studied with a high level of experimental control, such as multiple control groups or double-blind designs.

  12. Risk Decision Making Model for Reservoir Floodwater resources Utilization

    NASA Astrophysics Data System (ADS)

    Huang, X.

    2017-12-01

    Floodwater resources utilization (FRU) can alleviate the shortage of water resources, but it carries risks. In order to utilize floodwater resources safely and efficiently, it is necessary to study the risk of reservoir FRU. In this paper, the risk rate of exceeding the design flood water level and the risk rate of exceeding the safe discharge are estimated. Based on the principles of minimum risk and maximum benefit of FRU, a multi-objective risk decision-making model for FRU is constructed. Probability theory and mathematical statistics are used to calculate the risk rates; the C-D production function method and emergy analysis are used to calculate the risk benefit; the risk loss is related to the flood inundation area and the loss per unit area; and the multi-objective decision-making problem of the model is solved by the constraint method. Taking the Shilianghe reservoir in Jiangsu Province as an example, the optimal equilibrium solution of FRU is found using the risk decision-making model, and the validity and applicability of the model are verified.
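
    The constraint method for such a multi-objective choice reduces to maximizing benefit subject to a risk cap; a toy sketch with invented scheme data (not the Shilianghe reservoir figures):

```python
# Hypothetical FRU schemes: (relative benefit, risk rate of exceeding the
# design flood water level). All numbers are illustrative assumptions.
schemes = {
    "keep level +0.0 m": (1.00, 0.010),
    "keep level +0.3 m": (1.35, 0.018),
    "keep level +0.6 m": (1.60, 0.035),
    "keep level +0.9 m": (1.72, 0.080),
}

def constraint_method(schemes, risk_cap):
    """Constraint method: treat risk as a constraint (risk <= risk_cap)
    and maximize the remaining objective (benefit) over feasible schemes."""
    feasible = {k: v for k, v in schemes.items() if v[1] <= risk_cap}
    if not feasible:
        return None
    return max(feasible, key=lambda k: feasible[k][0])

chosen = constraint_method(schemes, risk_cap=0.04)
print("selected scheme:", chosen)  # "+0.9 m" is excluded by the risk cap
```

    Sweeping `risk_cap` traces out the trade-off curve between risk and benefit, from which the equilibrium solution is chosen.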

  13. Moire technique utilization for detection and measurement of scoliosis

    NASA Astrophysics Data System (ADS)

    Zawieska, Dorota; Podlasiak, Piotr

    1993-02-01

    The moire projection method enables non-contact measurement of the shape or deformation of different surfaces and structures by fringe pattern analysis. Acquiring a fringe map of the whole surface of the object under test is one of the main advantages over point-by-point methods. The computer analyzes the shape of the whole surface, after which the user can select individual points or cross-sections of the object map. In this paper, a few typical examples of the application of the moire technique to different medical problems are presented. The equipment is also described; the moire pattern analysis is done in real time using the phase-stepping method with a CCD camera.
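
    The phase-stepping computation itself is standard: with four fringe images shifted by 90°, the wrapped phase follows from an arctangent of frame differences. A self-contained sketch on synthetic fringes (the phase map and modulation values are invented):

```python
import numpy as np

# Synthetic fringe images for the standard 4-step phase-shifting algorithm:
# I_k = A + B * cos(phi + k * pi/2), k = 0..3.
h, w = 64, 64
yy, xx = np.mgrid[0:h, 0:w]
phi_true = 0.2 * xx + 0.1 * yy        # surface-induced phase (toy example)
A, B = 100.0, 50.0
I1, I2, I3, I4 = (A + B * np.cos(phi_true + k * np.pi / 2) for k in range(4))

# Wrapped phase recovered from the four frames: phi = atan2(I4 - I2, I1 - I3),
# since I4 - I2 = 2B sin(phi) and I1 - I3 = 2B cos(phi).
phi = np.arctan2(I4 - I2, I1 - I3)

# Compare with the wrapped ground truth (both live on (-pi, pi]).
wrapped_true = np.angle(np.exp(1j * phi_true))
max_err = np.abs(np.angle(np.exp(1j * (phi - wrapped_true)))).max()
print(f"max wrapped-phase error: {max_err:.2e} rad")
```

    In the measurement system the four frames come from the CCD camera at successive phase shifts of the projected grating; phase unwrapping then converts the wrapped map into surface height.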

  14. An investigation of dynamic-analysis methods for variable-geometry structures

    NASA Technical Reports Server (NTRS)

    Austin, F.

    1980-01-01

    Selected space structure configurations were reviewed in order to define dynamic analysis problems associated with variable geometry. The dynamics of a beam being constructed from a flexible base and the relocation of the completed beam by rotating the remote manipulator system about the shoulder joint were selected. Equations of motion were formulated in physical coordinates for both of these problems, and FORTRAN programs were developed to generate solutions by numerically integrating the equations. These solutions served as a standard of comparison to gauge the accuracy of approximate solution techniques that were developed and studied. Good control was achieved in both problems. Unstable control system coupling with the system flexibility did not occur. An approximate method was developed for each problem to enable the analyst to investigate variable geometry effects during a short time span using standard fixed geometry programs such as NASTRAN. The average angle and average length techniques are discussed.

  15. Predicting Response to Neoadjuvant Chemoradiotherapy in Esophageal Cancer with Textural Features Derived from Pretreatment 18F-FDG PET/CT Imaging.

    PubMed

    Beukinga, Roelof J; Hulshoff, Jan B; van Dijk, Lisanne V; Muijs, Christina T; Burgerhof, Johannes G M; Kats-Ugurlu, Gursah; Slart, Riemer H J A; Slump, Cornelis H; Mul, Véronique E M; Plukker, John Th M

    2017-05-01

    Adequate prediction of tumor response to neoadjuvant chemoradiotherapy (nCRT) in esophageal cancer (EC) patients is important for more personalized treatment. The current best clinical method to predict pathologic complete response is SUVmax in 18F-FDG PET/CT imaging. To improve the prediction of response, we constructed a model to predict complete response to nCRT in EC based on pretreatment clinical parameters and 18F-FDG PET/CT-derived textural features. Methods: From a prospectively maintained single-institution database, we reviewed 97 consecutive patients with locally advanced EC and a pretreatment 18F-FDG PET/CT scan between 2009 and 2015. All patients were treated with nCRT (carboplatin/paclitaxel/41.4 Gy) followed by esophagectomy. We analyzed clinical, geometric, and pretreatment textural features extracted from both 18F-FDG PET and CT. The current most accurate prediction model with SUVmax as a predictor variable was compared with 6 different response prediction models constructed using least absolute shrinkage and selection operator (LASSO) regularized logistic regression. Internal validation was performed to estimate the models' performance. Pathologic response was defined as complete versus incomplete response (Mandard tumor regression grade system 1 vs. 2-5). Results: Pathologic examination revealed 19 (19.6%) complete and 78 (80.4%) incomplete responders. LASSO regularization selected the clinical parameters histologic type and clinical T stage, the 18F-FDG PET-derived textural feature long run low gray level emphasis, and the CT-derived textural feature run percentage. Introducing these variables to a logistic regression analysis showed areas under the receiver-operating-characteristic curve (AUCs) of 0.78 compared with 0.58 in the SUVmax model. The discrimination slopes were 0.17 compared with 0.01, respectively. After internal validation, the AUCs decreased to 0.74 and 0.54, respectively. 
Conclusion: The predictive values of the constructed models were superior to those of the standard method (SUVmax). These results can be considered an initial step in predicting tumor response to nCRT in locally advanced EC. Further research in refining the predictive value of these models is needed to justify omission of surgery. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.
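
    A minimal sketch of LASSO (L1) regularized logistic regression of the kind described, using scikit-learn on synthetic data standing in for the paper's clinical and textural predictors. The feature effects and sample sizes are assumptions, and the apparent AUC is optimistically estimated on the training data (the paper's internal validation corrects for exactly this):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Synthetic stand-in: 97 patients, 10 candidate predictors of which only
# the first two truly carry signal (loosely mimicking one clinical and
# one textural feature); roughly 20% complete responders.
n, p = 97, 10
X = rng.normal(size=(n, p))
logit = 1.2 * X[:, 0] - 1.0 * X[:, 1] - 1.4
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

# L1 (lasso-type) regularization shrinks uninformative coefficients to 0.
model = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
model.fit(X, y)

selected = np.flatnonzero(model.coef_[0])
auc = roc_auc_score(y, model.predict_proba(X)[:, 1])
print("selected features:", selected, " apparent AUC: %.2f" % auc)
```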

  16. Calibration sets selection strategy for the construction of robust PLS models for prediction of biodiesel/diesel blends physico-chemical properties using NIR spectroscopy

    NASA Astrophysics Data System (ADS)

    Palou, Anna; Miró, Aira; Blanco, Marcelo; Larraz, Rafael; Gómez, José Francisco; Martínez, Teresa; González, Josep Maria; Alcalà, Manel

    2017-06-01

    Even though the feasibility of using near infrared (NIR) spectroscopy combined with partial least squares (PLS) regression for prediction of physico-chemical properties of biodiesel/diesel blends has been widely demonstrated, including in the calibration sets the whole variability of diesel samples from diverse production origins still remains an important challenge when constructing the models. This work presents a useful strategy for the systematic selection of calibration sets of samples of biodiesel/diesel blends from diverse origins, based on a binary code, principal component analysis (PCA) and the Kennard-Stone algorithm. Results show that using this methodology the models can keep their robustness over time. PLS calculations have been done using specialized chemometric software as well as the software of the NIR instrument installed in the plant, and both produced RMSEP values below the reproducibility of the reference methods. The models were validated for on-line simultaneous determination of seven properties: density, cetane index, fatty acid methyl ester (FAME) content, cloud point, boiling point at 95% recovery, flash point and sulphur content.
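
    The Kennard-Stone algorithm used for calibration-set selection picks the two most distant samples first, then repeatedly adds the sample farthest from its nearest already-selected neighbour, so the calibration set spans the sample space. A compact sketch (the toy data stand in for, e.g., PCA scores of NIR spectra):

```python
import numpy as np

def kennard_stone(X, k):
    """Select k calibration samples spanning the space of X (n x p)."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)  # pairwise
    i, j = np.unravel_index(np.argmax(d), d.shape)
    chosen = [int(i), int(j)]                   # start with the extremes
    while len(chosen) < k:
        remaining = [r for r in range(len(X)) if r not in chosen]
        # distance of each remaining sample to its NEAREST chosen sample
        min_d = d[np.ix_(remaining, chosen)].min(axis=1)
        chosen.append(remaining[int(np.argmax(min_d))])
    return chosen

rng = np.random.default_rng(1)
X = rng.normal(size=(40, 3))        # e.g. scores on 3 principal components
cal = kennard_stone(X, 10)
print(sorted(cal))
```

    The remaining samples, interior to the selected ones, are then available for the validation set.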

  17. A rapid method for differentiating Saccharomyces sensu stricto strains from other yeast species in an enological environment.

    PubMed

    Nardi, Tiziana; Carlot, Milena; De Bortoli, Elena; Corich, Viviana; Giacomini, Alessio

    2006-11-01

    During programs for the selection of enological yeasts, several hundred natural isolates are usually screened. The aim of these screenings is to isolate strains possessing good fermentative properties without necessarily arriving at a precise species designation: in other words, to detect strains belonging to the Saccharomyces sensu stricto complex. In the present study, a pair of primers, designed within the variable D1/D2 region of the 26S subunit of ribosomal yeast RNA, was constructed. These primers generate an amplification fragment of 471 bp that is specific for the seven Saccharomyces sensu stricto species, while no signal was obtained for Saccharomyces sensu lato strains (17 species) or for another 18 selected species commonly found in enological environments. A second pair of primers was also constructed, within the 18S rRNA gene, from perfectly conserved sequences common to all 42 yeast species examined; these generate a band of circa 900 bp for all strains. This was used as a positive experimental control in multiplex PCR analysis using all four primers.

  18. Rapid evolution of troglomorphic characters suggests selection rather than neutral mutation as a driver of eye reduction in cave crabs.

    PubMed

    Klaus, Sebastian; Mendoza, José C E; Liew, Jia Huan; Plath, Martin; Meier, Rudolf; Yeo, Darren C J

    2013-04-23

    This study asked whether reductive traits in cave organisms evolve at a slower pace (suggesting neutral evolution under relaxed selection) than constructive changes, which are likely to evolve under directional selection. We investigated 11 subterranean and seven surface populations of Sundathelphusa freshwater crabs on Bohol Island, Philippines, and examined constructive traits associated with improved food finding in darkness (increased leg and setae length) and reductive traits (reduced cornea size and eyestalk length). All changes occurred rapidly, given that the age of the most recent common ancestor was estimated to be 722-271 ka based on three mitochondrial markers. In order to quantify the speed of character change, we correlated the degree of morphological change with genetic distances between surface and subterranean individuals. The temporal pattern of character change following the transition to subterranean life was indistinguishable for constructive and reductive traits, characterized by an immediate onset and rapid evolutionary change. We propose that the evolution of these reductive traits-just like constructive traits-is most likely driven by strong directional selection.

  19. Ensemble Feature Learning of Genomic Data Using Support Vector Machine

    PubMed Central

    Anaissi, Ali; Goyal, Madhu; Catchpoole, Daniel R.; Braytee, Ali; Kennedy, Paul J.

    2016-01-01

    The identification of a subset of genes having the ability to capture the necessary information to distinguish classes of patients is crucial in bioinformatics applications. Ensemble and bagging methods have been shown to work effectively in the process of gene selection and classification. Testament to that is random forest, which combines random decision trees with bagging to improve overall feature selection and classification accuracy. Surprisingly, the adoption of these methods in support vector machines has only recently received attention, but mostly on classification, not gene selection. This paper introduces an ensemble SVM-Recursive Feature Elimination (ESVM-RFE) method for gene selection that follows the concepts of ensemble and bagging used in random forest but adopts the backward elimination strategy which is the rationale of the RFE algorithm. The rationale is that building ensemble SVM models from randomly drawn bootstrap samples of the training set produces different feature rankings, which are subsequently aggregated into one feature ranking. As a result, the decision to eliminate a feature is based upon the ranking of multiple SVM models instead of choosing one particular model. Moreover, this approach addresses the problem of imbalanced datasets by constructing nearly balanced bootstrap samples. Our experiments show that ESVM-RFE for gene selection substantially increased the classification performance on five microarray datasets compared to state-of-the-art methods. Experiments on the childhood leukaemia dataset show that an average 9% better accuracy is achieved by ESVM-RFE over SVM-RFE, and 5% over the random forest based approach. The genes selected by the ESVM-RFE algorithm were further explored with Singular Value Decomposition (SVD), which reveals significant clusters within the selected data. PMID:27304923
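
    The ensemble ranking idea can be sketched as follows: each bootstrap sample yields a linear SVM whose absolute weights score the features, the normalized scores are aggregated across the ensemble, and the lowest-ranked feature is eliminated before the process repeats. This is a simplified illustration of the ESVM-RFE concept, not the authors' implementation; the data, ensemble size, and one-at-a-time elimination schedule are invented:

```python
import numpy as np
from sklearn.svm import LinearSVC

def esvm_rank(X, y, n_boot=25, seed=0):
    """Aggregate |weight|-based feature scores over bootstrap SVMs."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    scores = np.zeros(p)
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)            # bootstrap sample
        svm = LinearSVC(C=1.0, dual=False).fit(X[idx], y[idx])
        w = np.abs(svm.coef_[0])
        scores += w / w.sum()                       # normalized, then summed
    return np.argsort(scores)[::-1]                 # best feature first

def esvm_rfe(X, y, n_keep):
    """Backward elimination: drop the worst ensemble-ranked feature."""
    features = list(range(X.shape[1]))
    while len(features) > n_keep:
        order = esvm_rank(X[:, features], y)
        features.pop(order[-1])                     # eliminate lowest-ranked
    return features

rng = np.random.default_rng(2)
X = rng.normal(size=(120, 8))
y = (X[:, 0] + 0.8 * X[:, 3] + 0.3 * rng.normal(size=120) > 0).astype(int)
print(esvm_rfe(X, y, n_keep=2))    # the informative features should survive
```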

  20. Liquid rocket booster study. Volume 2, book 4, appendices 6-8: Reports of Rocketdyne, Pratt and Whitney, and TRW

    NASA Technical Reports Server (NTRS)

    1988-01-01

    For the pressure fed engines, detailed trade studies were conducted defining engine features such as thrust vector control methods, thrust chamber construction, etc. This was followed by engine design layouts and booster propulsion configuration layouts. For the pump fed engines, parametric performance and weight data were generated for both O2/H2 and O2/RP-1 engines. Subsequent studies resulted in the selection of both LOX/RP-1 and O2/H2 propellants for the pump fed engines. More detailed analysis of the selected LOX/RP-1 and O2/H2 engines was conducted during the final phase of the study.

  1. Rapid construction of pinhole SPECT system matrices by distance-weighted Gaussian interpolation method combined with geometric parameter estimations

    NASA Astrophysics Data System (ADS)

    Lee, Ming-Wei; Chen, Yi-Chun

    2014-02-01

    In pinhole SPECT applied to small-animal studies, it is essential to have an accurate imaging system matrix, called the H matrix, for high-spatial-resolution image reconstructions. Generally, an H matrix can be obtained by various methods, such as measurements, simulations or some combination of both. In this study, a distance-weighted Gaussian interpolation method combined with geometric parameter estimations (DW-GIMGPE) is proposed. It utilizes a simplified grid-scan experiment on selected voxels and parameterizes the measured point response functions (PRFs) into 2D Gaussians. The PRFs of missing voxels are interpolated from the relations between the Gaussian coefficients and the geometric parameters of the imaging system with distance-weighting factors. The weighting factors are related to the projected centroids of the voxels on the detector plane. A full H matrix is constructed by combining the measured and interpolated PRFs of all voxels. The PRFs estimated by DW-GIMGPE showed similar profiles to the measured PRFs. OSEM reconstructed images of a hot-rod phantom and normal rat myocardium demonstrated the effectiveness of the proposed method. The detectability in a SKE/BKE task on a synthetic spherical test object verified that the constructed H matrix provided detectability comparable to that of the H matrix acquired by a full 3D grid-scan experiment. The reduction in the acquisition time of a full 1.0-mm grid H matrix was about 15.2- and 62.2-fold with the simplified grid pattern on a 2.0-mm and a 4.0-mm grid, respectively. A finer-grid H matrix down to 0.5-mm spacing interpolated by the proposed method would additionally shorten the acquisition time by a factor of 8.
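
    The distance-weighted interpolation step can be illustrated in miniature: Gaussian PRF parameters fitted at measured voxels are carried over to a missing voxel by inverse-distance weighting. The positions, parameter values, and weighting exponent below are hypothetical, and the real method additionally exploits the system's geometric parameters:

```python
import numpy as np

def idw_interpolate(known_pos, known_params, query_pos, power=2):
    """Inverse-distance weighting of PRF Gaussian parameters
    (amplitude, centroid x/y, width) from measured voxels to a missing one."""
    d = np.linalg.norm(known_pos - query_pos, axis=1)
    if np.any(d == 0):                       # query coincides with a voxel
        return known_params[np.argmin(d)]
    w = 1.0 / d**power
    return (w[:, None] * known_params).sum(axis=0) / w.sum()

# Measured voxels on a coarse grid, each with fitted (amp, cx, cy, sigma).
pos = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 4.0], [4.0, 4.0]])
params = np.array([[1.0, 10.0, 10.0, 1.5],
                   [0.9, 30.0, 10.0, 1.7],
                   [0.9, 10.0, 30.0, 1.7],
                   [0.8, 30.0, 30.0, 1.9]])
print(idw_interpolate(pos, params, np.array([2.0, 2.0])))
```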

  2. In silico design, construction and cloning of Trastuzumab humanized monoclonal antibody: A possible biosimilar for Herceptin

    PubMed Central

    Akbarzadeh-Sharbaf, Soudabeh; Yakhchali, Bagher; Minuchehr, Zarrin; Shokrgozar, Mohammad Ali; Zeinali, Sirous

    2012-01-01

    Background: There is a novel hypothesis that antibodies may have specificity for two distinct antigens, termed “dual specificity”. This hypothesis was evaluated for some defined therapeutic monoclonal antibodies (mAbs) such as Trastuzumab, Pertuzumab, Bevacizumab, and Cetuximab. In silico design and construction of expression vectors for the trastuzumab monoclonal antibody were also performed in this work. Materials and Methods: First, in bioinformatics studies the 3D structures of the mAbs concerned were obtained from the Protein Data Bank (PDB). Three-dimensional structural alignments were performed with the SIM and MUSTANG software. AutoDock4.2 software was also used for the docking analysis. Second, suitable genes for the trastuzumab heavy and light chains were designed, synthesized, and cloned in a prokaryotic vector. These fragments were individually PCR amplified and cloned into pcDNA™ 3.3-TOPO® and pOptiVEC™ TOPO® shuttle vectors, using standard methods. Results: First, many bioinformatics tools were applied, but no new dual specificity was found in the selected antibodies. In the following step, suitable expression cassettes for the heavy and light chains of the Trastuzumab therapeutic mAb were designed and constructed. Gene cloning was successfully performed and the created constructs were confirmed by gene mapping and sequencing. Conclusions: This study was based on a recently developed technology for mAb expression in mammalian cells. The obtained constructs could be used for biosimilar recombinant mAb production in the CHO DG44 dihydrofolate reductase (DHFR) gene deficient cell line in suspension culture medium. PMID:23210080

  3. Assessing the overuse of antibiotics in children in Saudi Arabia: validation of the parental perception on antibiotics scale (PAPA scale)

    PubMed Central

    2013-01-01

    Background Antibiotics overuse is a global public health issue influenced by several factors, of which some are parent-related psychosocial factors that can only be measured using valid and reliable psychosocial measurement instruments. The PAPA scale was developed to measure these factors and the content validity of this instrument was assessed. Aim This study further validated the recently developed instrument in terms of (1) face validity and (2) construct validity including: deciding the number and nature of factors, and item selection. Methods Questionnaires were self-administered to parents of children between the ages of 0 and 12 years old. Parents were conveniently recruited from schools’ parental meetings in the Eastern Province, Saudi Arabia. Face validity was assessed with regards to questionnaire clarity and unambiguity. Construct validity and item selection processes were conducted using Exploratory factor analysis. Results Parallel analysis and Exploratory factor analysis using principal axis factoring produced six factors in the developed instrument: knowledge and beliefs, behaviours, sources of information, adherence, awareness about antibiotics resistance, and parents’ perception regarding doctors’ prescribing behaviours. Reliability was assessed (Cronbach’s alpha = 0.78) which demonstrates the instrument as being reliable. Conclusion The ‘factors’ produced in this study coincide with the constructs contextually identified in the development phase of other instruments used to study antibiotic use. However, no other study considering perceptions of antibiotic use had gone beyond content validation of such instruments. This study is the first to constructively validate the factors underlying perceptions regarding antibiotic use in any population and in parents in particular. PMID:23497151

  4. Selecting Paradigms From Cognitive Neuroscience for Translation into Use in Clinical Trials: Proceedings of the Third CNTRICS Meeting

    PubMed Central

    Barch, Deanna M.; Carter, Cameron S.; Arnsten, Amy; Buchanan, Robert W.; Cohen, Jonathan D.; Geyer, Mark; Green, Michael F.; Krystal, John H.; Nuechterlein, Keith; Robbins, Trevor; Silverstein, Steven; Smith, Edward E.; Strauss, Milton; Wykes, Til; Heinssen, Robert

    2009-01-01

    This overview describes the goals and objectives of the third conference conducted as part of the Cognitive Neuroscience Treatment Research to Improve Cognition in Schizophrenia (CNTRICS) initiative. This third conference was focused on selecting specific paradigms from cognitive neuroscience that measured the constructs identified in the first CNTRICS meeting, with the goal of facilitating the translation of these paradigms into use in clinical trials contexts. To identify such paradigms, we had an open nomination process in which the field was asked to nominate potentially relevant paradigms and to provide information on several domains relevant to selecting the most promising tasks for each construct (eg, construct validity, neural bases, psychometrics, availability of animal models). Our goal was to identify 1–2 promising tasks for each of the 11 constructs identified at the first CNTRICS meeting. In this overview article, we describe the on-line survey used to generate nominations for promising tasks, the criteria that were used to select the tasks, the rationale behind the criteria, and the ways in which breakout groups worked together to identify the most promising tasks from among those nominated. This article serves as an introduction to the set of 6 articles included in this special issue that provide information about the specific tasks discussed and selected for the constructs from each of 6 broad domains (working memory, executive control, attention, long-term memory, perception, and social cognition). PMID:19023126

  5. Extravehicular Crewman Work System (ECWS) study program. Volume 2: Construction

    NASA Technical Reports Server (NTRS)

    Wilde, R. C.

    1980-01-01

    The construction portion of the Extravehicular Crewman Work System Study defines the requirements and selects the concepts for the crewman work system required to support the construction of large structures in space.

  6. A tool for selecting SNPs for association studies based on observed linkage disequilibrium patterns.

    PubMed

    De La Vega, Francisco M; Isaac, Hadar I; Scafe, Charles R

    2006-01-01

    The design of genetic association studies using single-nucleotide polymorphisms (SNPs) requires the selection of subsets of the variants providing high statistical power at a reasonable cost. SNPs must be selected to maximize the probability that a causative mutation is in linkage disequilibrium (LD) with at least one marker genotyped in the study. The HapMap project performed a genome-wide survey of genetic variation with about a million SNPs typed in four populations, providing a rich resource to inform the design of association studies. A number of strategies have been proposed for the selection of SNPs based on observed LD, including construction of metric LD maps and the selection of haplotype tagging SNPs. Power calculations are important at the study design stage to ensure successful results. Integrating these methods and annotations can be challenging: the algorithms required to implement these methods are complex to deploy, and all the necessary data and annotations are deposited in disparate databases. Here, we present the SNPbrowser Software, a freely available tool to assist in the LD-based selection of markers for association studies. This stand-alone application provides fast query capabilities and swift visualization of SNPs, gene annotations, power, haplotype blocks, and LD map coordinates. Wizards implement several common SNP selection workflows including the selection of optimal subsets of SNPs (e.g. tagging SNPs). Selected SNPs are screened for their conversion potential to either TaqMan SNP Genotyping Assays or the SNPlex Genotyping System, two commercially available genotyping platforms, expediting the set-up of genetic studies with an increased probability of success.
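
    Tagging-SNP selection of the kind such wizards implement is often posed as a greedy set cover over the pairwise r² matrix: repeatedly pick the SNP that tags the most not-yet-covered SNPs until every SNP is in LD with at least one chosen tag. A toy sketch (the LD matrix is invented):

```python
import numpy as np

def greedy_tag_snps(r2, threshold=0.8):
    """Greedy set cover: pick tags until every SNP has r^2 >= threshold
    with at least one chosen tag (a common tagging-SNP heuristic)."""
    n = r2.shape[0]
    covered = np.zeros(n, dtype=bool)
    tags = []
    while not covered.all():
        gain = ((r2 >= threshold) & ~covered).sum(axis=1)
        best = int(np.argmax(gain))          # SNP tagging the most new SNPs
        tags.append(best)
        covered |= r2[best] >= threshold
    return tags

# Toy LD matrix: SNPs {0,1,2} form one block, {3,4} another, 5 is alone.
r2 = np.eye(6)
for i, j in [(0, 1), (0, 2), (1, 2), (3, 4)]:
    r2[i, j] = r2[j, i] = 0.9
print(greedy_tag_snps(r2))    # one tag per block: [0, 3, 5]
```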

  7. Protein construct storage: Bayesian variable selection and prediction with mixtures.

    PubMed

    Clyde, M A; Parmigiani, G

    1998-07-01

    Determining optimal conditions for protein storage while maintaining a high level of protein activity is an important question in pharmaceutical research. A designed experiment based on a space-filling design was conducted to understand the effects of factors affecting protein storage and to establish optimal storage conditions. Different model-selection strategies to identify important factors may lead to very different answers about optimal conditions. Uncertainty about which factors are important, or model uncertainty, can be a critical issue in decision-making. We use Bayesian variable selection methods for linear models to identify important variables in the protein storage data, while accounting for model uncertainty. We also use the Bayesian framework to build predictions based on a large family of models, rather than an individual model, and to evaluate the probability that certain candidate storage conditions are optimal.
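
    Bayesian model averaging over candidate factor subsets can be sketched with BIC-based approximate posterior weights; posterior inclusion probabilities then quantify how certainly each factor matters. This is a generic illustration of averaging over a model family rather than the authors' specific mixture formulation, and the data are synthetic:

```python
import itertools
import numpy as np

def bic(X, y):
    """BIC of an OLS fit (Gaussian errors); lower is better."""
    n, k = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    return n * np.log(rss / n) + k * np.log(n)

rng = np.random.default_rng(3)
n = 60
factors = rng.normal(size=(n, 3))     # e.g. pH, temperature, additive level
y = 2.0 * factors[:, 0] + 0.5 * rng.normal(size=n)   # only factor 0 matters

# Enumerate all 2^3 submodels (always keeping an intercept column).
models, weights, inclusion = [], [], np.zeros(3)
for mask in itertools.product([0, 1], repeat=3):
    cols = [j for j in range(3) if mask[j]]
    X = np.column_stack([np.ones(n)] + [factors[:, j] for j in cols])
    models.append(cols)
    weights.append(np.exp(-0.5 * bic(X, y)))
weights = np.array(weights) / np.sum(weights)    # approximate posterior probs
for cols, w in zip(models, weights):
    inclusion[cols] += w
print("posterior inclusion probabilities:", inclusion.round(3))
```

    Predictions averaged with these weights account for model uncertainty instead of committing to a single selected model.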

  8. Attenuation relation for strong motion in Eastern Java based on appropriate database and method

    NASA Astrophysics Data System (ADS)

    Mahendra, Rian; Rohadi, Supriyanto; Rudyanto, Ariska

    2017-07-01

    The selection and determination of an attenuation relation has become important for seismic hazard assessment in active seismic regions. This research initially constructs an appropriate strong motion database, including site condition and type of earthquake. The data set consists of a large number of earthquakes of 5 ≤ Mw ≤ 9 at distances less than 500 km that occurred around Java from 2009 until 2016. The locations and depths of the earthquakes were relocated using the double difference method to improve the quality of the database. Strong motion data from twelve BMKG accelerographs located in East Java were used. The site condition was characterized using the dominant period and Vs30. The earthquakes were classified into crustal, interface, and intraslab events based on slab geometry analysis. A total of 10 Ground Motion Prediction Equations (GMPEs) were tested using the likelihood method (Scherbaum et al., 2004) and the Euclidean distance ranking method (Kale and Akkar, 2012) with the associated database. The evaluation led to a set of GMPEs that can be applied for seismic hazard in East Java, where the strong motion data were collected. Because considerable deviation remained in the GMPEs, some of them were modified using an inversion method. Validation was performed by analysing the attenuation curve of the selected GMPE against observation data from 2015 to 2016. The results show that the selected GMPE is suitable for estimating PGA values in East Java.
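
    The likelihood-based ranking of Scherbaum et al. (2004) scores each candidate GMPE by the average negative log2-likelihood of its normalized residuals; lower scores indicate better agreement with the observations. A sketch with two hypothetical candidate models (the observations and predictions are synthetic):

```python
import numpy as np

def llh_score(obs_ln, pred_ln, sigma):
    """Average negative log2-likelihood of normalized residuals
    (in the spirit of Scherbaum et al., 2004); lower is better."""
    z = (obs_ln - pred_ln) / sigma
    log2_pdf = (-0.5 * np.log2(2 * np.pi) - np.log2(sigma)
                - 0.5 * z**2 / np.log(2))
    return -np.mean(log2_pdf)

rng = np.random.default_rng(4)
obs = rng.normal(loc=-2.0, scale=0.6, size=200)     # ln(PGA) observations

# Two hypothetical candidate GMPEs: one unbiased, one with an offset bias.
good = {"pred": np.full(200, -2.0), "sigma": 0.6}
biased = {"pred": np.full(200, -1.2), "sigma": 0.6}

for name, g in [("good", good), ("biased", biased)]:
    print(name, round(llh_score(obs, g["pred"], g["sigma"]), 2))
```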

  9. Optical spectral singularities as threshold resonances

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mostafazadeh, Ali

    2011-04-15

    Spectral singularities are among generic mathematical features of complex scattering potentials. Physically they correspond to scattering states that behave like zero-width resonances. For a simple optical system, we show that a spectral singularity appears whenever the gain coefficient coincides with its threshold value and other parameters of the system are selected properly. We explore a concrete realization of spectral singularities for a typical semiconductor gain medium and propose a method of constructing a tunable laser that operates at threshold gain.

  10. Blockade of Tumor Cell TGF-Betas: A Strategy to Reverse Antiestrogen Resistance in Human Breast Cancer

    DTIC Science & Technology

    2002-01-01

    The TM-FKHRL1 construct exhibited exclusive nuclear localization of the HA-tagged mutant under all experimental conditions. Cell cycle distribution was measured by flow cytometry (Figure 8A). Consistent with its antiapoptotic effect ... addition of TGF-beta ... Cells were selected by flow cytometry; under these conditions more than 95% of selected cells expressed GFP at the time of the experiments. Immunoblot analysis: cells were ...

  11. Receptive fields selection for binary feature description.

    PubMed

    Fan, Bin; Kong, Qingqun; Trzcinski, Tomasz; Wang, Zhiheng; Pan, Chunhong; Fua, Pascal

    2014-06-01

    Feature description for local image patches is widely used in computer vision. While the conventional way to design a local descriptor is based on expert experience and knowledge, learning-based methods for designing local descriptors have become more and more popular because of their good performance and data-driven property. This paper proposes a novel data-driven method for designing binary feature descriptors, which we call the receptive fields descriptor (RFD). Technically, RFD is constructed by thresholding responses of a set of receptive fields, which are selected from a large number of candidates according to their distinctiveness and correlations in a greedy way. Using two different kinds of receptive fields (namely rectangular pooling area and Gaussian pooling area) for selection, we obtain two binary descriptors, RFDR and RFDG, accordingly. Image matching experiments on the well-known patch data set and the Oxford data set demonstrate that RFD significantly outperforms state-of-the-art binary descriptors, and is comparable with the best float-valued descriptors at a fraction of the processing time. Finally, experiments on object recognition tasks confirm that both RFDR and RFDG successfully bridge the performance gap between binary descriptors and their floating-point competitors.
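
    The construction can be sketched in miniature: pool intensities inside candidate rectangular fields, threshold the responses to bits, then greedily keep distinctive bits that are not too correlated with those already chosen. The pooling layout, thresholds, and selection criteria below are simplified assumptions, not the paper's exact procedure:

```python
import numpy as np

def pooled_responses(patches, fields):
    """Mean intensity inside each rectangular receptive field."""
    out = np.empty((len(patches), len(fields)))
    for j, (r0, r1, c0, c1) in enumerate(fields):
        out[:, j] = patches[:, r0:r1, c0:c1].mean(axis=(1, 2))
    return out

def select_fields(bits, k, max_corr=0.8):
    """Greedy: most distinctive (balanced) bits first, skipping any bit
    too correlated with an already-selected one."""
    # distinctiveness proxy: bit mean closest to 0.5 over training patches
    order = np.argsort(np.abs(bits.mean(axis=0) - 0.5))
    chosen = []
    for j in order:
        if all(abs(np.corrcoef(bits[:, j], bits[:, c])[0, 1]) < max_corr
               for c in chosen):
            chosen.append(int(j))
        if len(chosen) == k:
            break
    return chosen

rng = np.random.default_rng(7)
patches = rng.random((200, 8, 8))                  # toy image patches
fields = [(r, r + 4, c, c + 4) for r in (0, 4) for c in (0, 4)]
resp = pooled_responses(patches, fields)
bits = (resp > resp.mean(axis=0)).astype(int)      # thresholded to binary
print("selected fields:", select_fields(bits, k=3))
```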

  12. An efficient procedure for marker-free mutagenesis of S. coelicolor by site-specific recombination for secondary metabolite overproduction.

    PubMed

    Zhang, Bo; Zhang, Lin; Dai, Ruixue; Yu, Meiying; Zhao, Guoping; Ding, Xiaoming

    2013-01-01

    Streptomyces bacteria are known for producing important natural compounds by secondary metabolism, especially antibiotics with novel biological activities. Functional studies of antibiotic-biosynthesizing gene clusters are generally through homologous genomic recombination by gene-targeting vectors. Here, we present a rapid and efficient method for construction of gene-targeting vectors. This approach is based on Streptomyces phage φBT1 integrase-mediated multisite in vitro site-specific recombination. Four 'entry clones' were assembled into a circular plasmid to generate the destination gene-targeting vector by a one-step reaction. The four 'entry clones' contained two clones of the upstream and downstream flanks of the target gene, a selectable marker and an E. coli-Streptomyces shuttle vector. After targeted modification of the genome, the selectable markers were removed by φC31 integrase-mediated in vivo site-specific recombination between pre-placed attB and attP sites. Using this method, part of the calcium-dependent antibiotic (CDA) and actinorhodin (Act) biosynthetic gene clusters were deleted, and the rrdA encoding RrdA, a negative regulator of Red production, was also deleted. The final prodiginine production of the engineered strain was over five times that of the wild-type strain. This straightforward φBT1 and φC31 integrase-based strategy provides an alternative approach for rapid gene-targeting vector construction and marker removal in streptomycetes.

  13. Development of an Indirect Stereolithography Technology for Scaffold Fabrication with a Wide Range of Biomaterial Selectivity

    PubMed Central

    Kang, Hyun-Wook

    2012-01-01

    Tissue engineering, which is the study of generating biological substitutes to restore or replace tissues or organs, has the potential to meet current needs for organ transplantation and medical interventions. Various approaches have been attempted to apply three-dimensional (3D) solid freeform fabrication technologies to tissue engineering for scaffold fabrication. Among these, the stereolithography (SL) technology not only has the highest resolution, but also offers quick fabrication. However, a lack of suitable biomaterials is a barrier to applying the SL technology to tissue engineering. In this study, an indirect SL method that combines the SL technology and a sacrificial molding process was developed to address this challenge. A sacrificial mold with an inverse porous shape was fabricated from an alkali-soluble photopolymer by the SL technology. A sacrificial molding process was then developed for scaffold construction using a variety of biomaterials. The results indicated a wide range of biomaterial selectivity and a high resolution. Achievable minimum pore and strut sizes were as small as 50 and 65 μm, respectively. This technology can also be used to fabricate three-dimensional organ shapes, and combined with traditional fabrication methods to construct a new type of scaffold with a dual-pore size. Cytotoxicity tests, as well as nuclear magnetic resonance and gel permeation chromatography analyses, showed that this technology has great potential for tissue engineering applications. PMID:22443315

  14. Photolithography-Based Patterning of Liquid Metal Interconnects for Monolithically Integrated Stretchable Circuits.

    PubMed

    Park, Chan Woo; Moon, Yu Gyeong; Seong, Hyejeong; Jung, Soon Won; Oh, Ji-Young; Na, Bock Soon; Park, Nae-Man; Lee, Sang Seok; Im, Sung Gap; Koo, Jae Bon

    2016-06-22

    We demonstrate a new patterning technique for gallium-based liquid metals on flat substrates, which can provide both high pattern resolution (∼20 μm) and alignment precision as required for highly integrated circuits. In a very similar manner as in the patterning of solid metal films by photolithography and lift-off processes, the liquid metal layer painted over the whole substrate area can be selectively removed by dissolving the underlying photoresist layer, leaving behind robust liquid patterns as defined by the photolithography. This quick and simple method makes it possible to integrate fine-scale interconnects with preformed devices precisely, which is indispensable for realizing monolithically integrated stretchable circuits. As a way for constructing stretchable integrated circuits, we propose a hybrid configuration composed of rigid device regions and liquid interconnects, which is constructed on a rigid substrate first but highly stretchable after being transferred onto an elastomeric substrate. This new method can be useful in various applications requiring both high-resolution and precisely aligned patterning of gallium-based liquid metals.

  15. Characterizing air quality data from complex network perspective.

    PubMed

    Fan, Xinghua; Wang, Li; Xu, Huihui; Li, Shasha; Tian, Lixin

    2016-02-01

    Air quality depends mainly on changes in the emission of pollutants and their precursors. Understanding its characteristics is the key to predicting and controlling air quality. In this study, complex networks were built to analyze topological characteristics of air quality data using the correlation coefficient method. First, PM2.5 (particulate matter with aerodynamic diameter less than 2.5 μm) indexes of eight monitoring sites in Beijing from January 2013 to December 2014 were selected as samples. Second, the C-C method was applied to determine the structure of the phase space. Points in the reconstructed phase space were taken as nodes of the mapped network, and edges were then created between pairs of nodes whose correlation exceeded a critical threshold. Three properties of the constructed networks, degree distribution, clustering coefficient, and modularity, were used to determine the optimal value of the critical threshold. Finally, by analyzing and comparing topological properties, we showed that similarities and differences among the constructed complex networks reveal influencing factors and their distinct roles in the real air quality system.
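    The pipeline described above (embed the time series into phase space, link nodes whose correlation exceeds a threshold, then inspect network properties) can be sketched as follows. The embedding dimension, threshold, and synthetic input series here are illustrative assumptions, not values from the study.

```python
import numpy as np

def embed(series, dim, tau=1):
    """Time-delay embedding: each row is one phase-space point (one node)."""
    n = len(series) - (dim - 1) * tau
    return np.array([series[i:i + dim * tau:tau] for i in range(n)])

def correlation_network(points, threshold):
    """Adjacency matrix: connect nodes whose Pearson correlation > threshold."""
    corr = np.corrcoef(points)            # pairwise correlations of embedded vectors
    adj = (corr > threshold).astype(int)
    np.fill_diagonal(adj, 0)              # no self-loops
    return adj

def clustering_coefficient(adj):
    """Mean local clustering coefficient of an undirected network."""
    coeffs = []
    for i in range(len(adj)):
        nbrs = np.flatnonzero(adj[i])
        k = len(nbrs)
        if k < 2:
            coeffs.append(0.0)
            continue
        links = adj[np.ix_(nbrs, nbrs)].sum() / 2   # edges among neighbors
        coeffs.append(2 * links / (k * (k - 1)))
    return float(np.mean(coeffs))

# Synthetic stand-in for a PM2.5 index series.
rng = np.random.default_rng(0)
series = np.sin(np.linspace(0, 20, 200)) + 0.1 * rng.standard_normal(200)
pts = embed(series, dim=4)
adj = correlation_network(pts, threshold=0.9)
print(adj.shape, clustering_coefficient(adj))
```

    In the study, the threshold would be swept and the value chosen where degree distribution, clustering coefficient, and modularity behave best; this sketch fixes a single threshold for brevity.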

  16. Half-quadratic variational regularization methods for speckle-suppression and edge-enhancement in SAR complex image

    NASA Astrophysics Data System (ADS)

    Zhao, Xia; Wang, Guang-xin

    2008-12-01

    Synthetic aperture radar (SAR) is an active remote sensing sensor. As a coherent imaging system, it suffers from speckle, an inherent defect that badly affects the interpretation and recognition of SAR targets. Conventional speckle-removal methods usually operate on real-valued SAR images and degrade image edges while suppressing the speckle; moreover, they discard the phase information of the image. Removing the speckle while simultaneously enhancing targets and edges therefore remains a challenge. To suppress the speckle and enhance the targets and edges simultaneously, a half-quadratic variational regularization method for complex SAR images is presented, based on prior knowledge of the targets and edges. Because the cost function is non-quadratic, non-convex, and complex-valued, a half-quadratic variational regularization is used to construct a new cost function, which is solved by alternate optimization. In the proposed scheme, the construction of the model, its solution, and the selection of the model parameters are studied carefully. Finally, we validate the method using real SAR data. Theoretical analysis and experimental results illustrate the feasibility of the proposed method. Furthermore, the proposed method preserves the phase information of the image.
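    The half-quadratic construction referred to above can be written generically as follows. This is the standard textbook (Geman-Reynolds-style) form with a generic edge-preserving potential φ, not the paper's exact cost function.

```latex
\begin{aligned}
&\text{Original non-convex cost over the complex image } u:\\
&\quad J(u) = \lVert y - u \rVert_2^{2} + \lambda \sum_{k} \phi\!\big(\lvert (Du)_k \rvert\big),\\
&\text{Half-quadratic augmentation with auxiliary variables } b_k:\\
&\quad \tilde J(u,b) = \lVert y - u \rVert_2^{2} + \lambda \sum_{k} \Big( b_k \,\lvert (Du)_k \rvert^{2} + \psi(b_k) \Big),\\
&\text{Alternate optimization:}\\
&\quad b_k^{(t)} = \frac{\phi'\!\big(\lvert (Du^{(t)})_k \rvert\big)}{2\,\lvert (Du^{(t)})_k \rvert},
\qquad
u^{(t+1)} = \arg\min_{u} \tilde J\big(u, b^{(t)}\big).
\end{aligned}
```

    For fixed b the augmented cost is quadratic in u, so the u-update reduces to a linear solve; the b-update has the closed form above. This alternation is what makes the otherwise non-convex problem tractable.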

  17. Breastfeeding Duration and the Theory of Planned Behavior and Breastfeeding Self-Efficacy Framework: A Systematic Review of Observational Studies.

    PubMed

    Lau, Christine Y K; Lok, Kris Y W; Tarrant, Marie

    2018-03-01

    Introduction Numerous studies have shown that the constructs of the Theory of Reasoned Action (TRA), Theory of Planned Behavior (TPB), and Breastfeeding Self-Efficacy (BSE) Framework can effectively identify relationships between maternal psychosocial factors and breastfeeding initiation. However, the ability of these theories to predict breastfeeding duration has not been adequately analyzed. The aim of the review was to examine the utility of the constructs of TRA/TPB and BSE to predict breastfeeding duration. Methods We conducted a literature search using Pubmed (1980-May 2015), Medline (1966-May 2015), CINAHL (1980-May 2015), EMBASE (1980-May 2015) and PsycINFO (1980-May 2015). We selected observational studies without randomization or blinding that used TRA, TPB or BSE as the framework for analysis. Only studies reporting on breastfeeding duration were included. Results Thirty studies were selected, of which four used TRA, 10 used TPB, 15 used BSE, and one used a combination of TPB and BSE. Maternal intention and breastfeeding self-efficacy were found to be important predictors of breastfeeding duration. Findings on the relationship between maternal attitudes, subjective norms, perceived behavioral control, and breastfeeding duration were inconsistent. Discussion The inadequacy of these constructs in explaining breastfeeding duration indicates a need to further explore the role of maternal self-determination in breastfeeding behavior.

  18. Semirational Approach for Ultrahigh Poly(3-hydroxybutyrate) Accumulation in Escherichia coli by Combining One-Step Library Construction and High-Throughput Screening.

    PubMed

    Li, Teng; Ye, Jianwen; Shen, Rui; Zong, Yeqing; Zhao, Xuejin; Lou, Chunbo; Chen, Guo-Qiang

    2016-11-18

    As the product of a multistep enzymatic reaction, accumulation of poly(3-hydroxybutyrate) (PHB) in Escherichia coli (E. coli) can be achieved by overexpression of the PHB synthesis pathway from a native producer, involving the three genes phbC, phbA, and phbB. Pathway optimization by adjusting the expression levels of the three genes can influence the properties of the final product. Here, we report a semirational approach for highly efficient PHB pathway optimization in E. coli based on a phbCAB operon cloned from the native producer Ralstonia eutropha (R. eutropha). Rationally designed ribosome binding site (RBS) libraries with defined strengths for each of the three genes were constructed on high or low copy number plasmids in a one-pot reaction by an oligo-linker mediated assembly (OLMA) method. Strains with desired properties were evaluated and selected by three different methodologies: visual selection, high-throughput screening, and detailed in-depth analysis. Applying this approach, strains accumulating PHB at 0-92% of cell dry weight (CDW) were obtained. PHB with various weight-average molecular weights (Mw) of 2.7-6.8 × 10^6 was also efficiently produced at relatively high contents. These results suggest that the semirational approach combining library design, construction, and proper screening is an efficient way to optimize PHB and other multienzyme pathways.
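    The size of such a combinatorial RBS library grows as the product of the per-gene variant counts, which is why a one-pot assembly plus high-throughput screen is attractive. A minimal sketch (the variant counts and strength values are hypothetical, not taken from the study):

```python
from itertools import product

# Hypothetical RBS variant strengths (arbitrary units) for each gene of the
# phbCAB operon; a real library would use designed RBS sequences.
rbs_strengths = {
    "phbC": [0.2, 1.0, 5.0],
    "phbA": [0.5, 2.0],
    "phbB": [0.1, 1.0, 10.0],
}

# Each combination of one RBS per gene is one library member (one-pot assembly).
library = list(product(*rbs_strengths.values()))
print(len(library))  # 3 * 2 * 3 = 18 combinations
```

    With realistic library sizes (dozens of RBS variants per gene), exhaustive characterization is impossible, which motivates the visual selection and high-throughput screening steps described above.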

  19. 7 CFR 1753.77 - Methods of minor construction.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 11 2011-01-01 2011-01-01 false Methods of minor construction. 1753.77 Section 1753..., DEPARTMENT OF AGRICULTURE TELECOMMUNICATIONS SYSTEM CONSTRUCTION POLICIES AND PROCEDURES Minor Construction § 1753.77 Methods of minor construction. Minor construction may be performed by contract using RUS...

  20. Radon emanation based material measurement and selection for the SuperNEMO double beta experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cerna, Cédric, E-mail: cerna@cenbg.in2p3.fr; Soulé, Benjamin; Perrot, Frédéric

    The SuperNEMO Demonstrator experiment aims to study the neutrinoless double beta decay of 7 kg of {sup 82}Se in order to reach a limit on the light Majorana neutrino mass mechanism of T{sub 1/2}(ββ0ν) > 6.5 × 10{sup 24} years (90% CL), equivalent to a mass sensitivity m{sub ββ} < 0.20 - 0.40 eV (90% CL), in two years of data taking. The detector construction started in 2014 and its installation in the Laboratoire Souterrain de Modane (LSM) is expected during the course of 2015. The remaining level of {sup 226}Ra ({sup 238}U chain) in the detector components can lead to the emanation of {sup 222}Rn gas. This isotope must be controlled and reduced down to a level of 150 µBq/m{sup 3} in the tracker chamber of the detector to achieve the physics goals. Besides the HPGe selection of the detector materials for their radiopurity, the most critical materials have been tested and selected in a dedicated setup facility able to measure their {sup 222}Rn emanation level. The operating principle relies on a large emanation tank (0.7 m{sup 3}) that allows measuring large material surfaces or large numbers of construction pieces. The emanation tank is coupled to an electrostatic detector equipped with a silicon diode to perform alpha spectroscopy of the gas it contains and detect the {sup 222}Rn daughters. The transfer efficiency and the detector efficiency have been carefully calibrated by different methods. The intrinsic background of the system allows one to measure {sup 222}Rn activities down to 3 mBq, leading to a typical emanation sensitivity of 20 µBq/m{sup 2}/day for a 30 m{sup 2} surface sample. Several construction materials have been measured and selected, such as nylon and aluminized Mylar films, photomultipliers, and tracker components of the SuperNEMO Demonstrator.
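    The quoted sensitivity figures are mutually consistent: at secular equilibrium a constant emanation rate R sustains a {sup 222}Rn activity A = R·τ, with τ = T{sub 1/2}/ln 2 ≈ 5.5 days. A quick arithmetic check (the steady-state assumption is ours, not stated in the record):

```python
import math

T_HALF_RN222_DAYS = 3.8235              # 222Rn half-life
tau = T_HALF_RN222_DAYS / math.log(2)   # mean lifetime, ~5.52 days

min_activity_mBq = 3.0                  # intrinsic background limit of the setup
area_m2 = 30.0                          # surface of a typical sample

# Emanation rate per unit area needed to sustain 3 mBq at equilibrium:
rate_uBq_per_m2_day = min_activity_mBq * 1000 / tau / area_m2
print(round(rate_uBq_per_m2_day))       # ~18, consistent with the quoted 20 µBq/m²/day
```

    The small difference from the quoted 20 µBq/m²/day is plausibly absorbed by the transfer and detection efficiencies the record says were calibrated separately.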
