48 CFR 53.105 - Computer generation.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 2 2010-10-01 2010-10-01 false Computer generation. 53...) CLAUSES AND FORMS FORMS General 53.105 Computer generation. (a) Agencies may computer-generate the... be computer generated by the public. Unless prohibited by agency regulations, forms prescribed by...
48 CFR 53.105 - Computer generation.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 48 Federal Acquisition Regulations System 2 2011-10-01 2011-10-01 false Computer generation. 53...) CLAUSES AND FORMS FORMS General 53.105 Computer generation. (a) Agencies may computer-generate the... be computer generated by the public. Unless prohibited by agency regulations, forms prescribed by...
27 CFR 19.634 - Computer-generated reports and transaction forms.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 27 Alcohol, Tobacco Products and Firearms 1 2013-04-01 2013-04-01 false Computer-generated reports... Reports Filing Forms and Reports § 19.634 Computer-generated reports and transaction forms. TTB will accept computer-generated reports of operations and transaction forms made using a computer printer on...
27 CFR 19.634 - Computer-generated reports and transaction forms.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 27 Alcohol, Tobacco Products and Firearms 1 2014-04-01 2014-04-01 false Computer-generated reports... Reports Filing Forms and Reports § 19.634 Computer-generated reports and transaction forms. TTB will accept computer-generated reports of operations and transaction forms made using a computer printer on...
27 CFR 19.634 - Computer-generated reports and transaction forms.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 27 Alcohol, Tobacco Products and Firearms 1 2012-04-01 2012-04-01 false Computer-generated reports... Reports Filing Forms and Reports § 19.634 Computer-generated reports and transaction forms. TTB will accept computer-generated reports of operations and transaction forms made using a computer printer on...
27 CFR 19.634 - Computer-generated reports and transaction forms.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 27 Alcohol, Tobacco Products and Firearms 1 2011-04-01 2011-04-01 false Computer-generated reports... Reports Filing Forms and Reports § 19.634 Computer-generated reports and transaction forms. TTB will accept computer-generated reports of operations and transaction forms made using a computer printer on...
48 CFR 52.253-1 - Computer Generated Forms.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 48 Federal Acquisition Regulations System 2 2012-10-01 2012-10-01 false Computer Generated Forms....253-1 Computer Generated Forms. As prescribed in FAR 53.111, insert the following clause: Computer... by the Federal Acquisition Regulation (FAR) may be submitted on a computer generated version of the...
48 CFR 52.253-1 - Computer Generated Forms.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 48 Federal Acquisition Regulations System 2 2013-10-01 2013-10-01 false Computer Generated Forms....253-1 Computer Generated Forms. As prescribed in FAR 53.111, insert the following clause: Computer... by the Federal Acquisition Regulation (FAR) may be submitted on a computer generated version of the...
48 CFR 52.253-1 - Computer Generated Forms.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 48 Federal Acquisition Regulations System 2 2014-10-01 2014-10-01 false Computer Generated Forms....253-1 Computer Generated Forms. As prescribed in FAR 53.111, insert the following clause: Computer... by the Federal Acquisition Regulation (FAR) may be submitted on a computer generated version of the...
48 CFR 52.253-1 - Computer Generated Forms.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 48 Federal Acquisition Regulations System 2 2011-10-01 2011-10-01 false Computer Generated Forms....253-1 Computer Generated Forms. As prescribed in FAR 53.111, insert the following clause: Computer... by the Federal Acquisition Regulation (FAR) may be submitted on a computer generated version of the...
48 CFR 52.253-1 - Computer Generated Forms.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 2 2010-10-01 2010-10-01 false Computer Generated Forms....253-1 Computer Generated Forms. As prescribed in FAR 53.111, insert the following clause: Computer... by the Federal Acquisition Regulation (FAR) may be submitted on a computer generated version of the...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-22
... Regulation; Clarification of Standards for Computer Generation of Forms AGENCY: Department of Defense (DoD... American National Standards Institute X12, as the valid standard to use for computer-generated forms. FAR... optional forms on their computers. In addition to clarifying that FIPS 161 is no longer in use, public...
13 CFR 120.194 - Use of computer forms.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 13 Business Credit and Assistance 1 2014-01-01 2014-01-01 false Use of computer forms. 120.194... Applying to All Business Loans Computerized Sba Forms § 120.194 Use of computer forms. Any Applicant or Participant may use computer generated SBA application forms, closing forms, and other forms designated by SBA...
13 CFR 120.194 - Use of computer forms.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 13 Business Credit and Assistance 1 2012-01-01 2012-01-01 false Use of computer forms. 120.194... Applying to All Business Loans Computerized Sba Forms § 120.194 Use of computer forms. Any Applicant or Participant may use computer generated SBA application forms, closing forms, and other forms designated by SBA...
13 CFR 120.194 - Use of computer forms.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 13 Business Credit and Assistance 1 2013-01-01 2013-01-01 false Use of computer forms. 120.194... Applying to All Business Loans Computerized Sba Forms § 120.194 Use of computer forms. Any Applicant or Participant may use computer generated SBA application forms, closing forms, and other forms designated by SBA...
13 CFR 120.194 - Use of computer forms.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 13 Business Credit and Assistance 1 2011-01-01 2011-01-01 false Use of computer forms. 120.194... Applying to All Business Loans Computerized Sba Forms § 120.194 Use of computer forms. Any Applicant or Participant may use computer generated SBA application forms, closing forms, and other forms designated by SBA...
13 CFR 120.194 - Use of computer forms.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Use of computer forms. 120.194... Applying to All Business Loans Computerized Sba Forms § 120.194 Use of computer forms. Any Applicant or Participant may use computer generated SBA application forms, closing forms, and other forms designated by SBA...
21 CFR 514.80 - Records and reports concerning experience with approved new animal drugs.
Code of Federal Regulations, 2014 CFR
2014-04-01
... and 2301? How can I get them? Can I use computer-generated equivalents? 514.80(d) Reporting forms. How... provided on the forms. Computer-generated equivalents of Form FDA 1932 or Form FDA 2301, approved by FDA... Medicine, Division of Surveillance (HFV-210), 7500 Standish Pl., Rockville, MD 20855-2764. (e) Records to...
21 CFR 514.80 - Records and reports concerning experience with approved new animal drugs.
Code of Federal Regulations, 2012 CFR
2012-04-01
... and 2301? How can I get them? Can I use computer-generated equivalents? 514.80(d) Reporting forms. How... provided on the forms. Computer-generated equivalents of Form FDA 1932 or Form FDA 2301, approved by FDA... Medicine, Division of Surveillance (HFV-210), 7500 Standish Pl., Rockville, MD 20855-2764. (e) Records to...
21 CFR 514.80 - Records and reports concerning experience with approved new animal drugs.
Code of Federal Regulations, 2013 CFR
2013-04-01
... and 2301? How can I get them? Can I use computer-generated equivalents? 514.80(d) Reporting forms. How... provided on the forms. Computer-generated equivalents of Form FDA 1932 or Form FDA 2301, approved by FDA... Medicine, Division of Surveillance (HFV-210), 7500 Standish Pl., Rockville, MD 20855-2764. (e) Records to...
21 CFR 514.80 - Records and reports concerning experience with approved new animal drugs.
Code of Federal Regulations, 2011 CFR
2011-04-01
... and 2301? How can I get them? Can I use computer-generated equivalents? 514.80(d) Reporting forms. How... provided on the forms. Computer-generated equivalents of Form FDA 1932 or Form FDA 2301, approved by FDA... Medicine, Division of Surveillance (HFV-210), 7500 Standish Pl., Rockville, MD 20855-2764. (e) Records to...
21 CFR 514.80 - Records and reports concerning experience with approved new animal drugs.
Code of Federal Regulations, 2010 CFR
2010-04-01
... and 2301? How can I get them? Can I use computer-generated equivalents? 514.80(d) Reporting forms. How... provided on the forms. Computer-generated equivalents of Form FDA 1932 or Form FDA 2301, approved by FDA... Medicine, Division of Surveillance (HFV-210), 7500 Standish Pl., Rockville, MD 20855-2764. (e) Records to...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vondy, D.R.; Fowler, T.B.; Cunningham, G.W.
1979-07-01
User input data requirements are presented for certain special processors in a nuclear reactor computation system. These processors generally read data in formatted form and generate binary interface data files. Some data processing is done to convert from the user oriented form to the interface file forms. The VENTURE diffusion theory neutronics code and other computation modules in this system use the interface data files which are generated.
A study of sound generation in subsonic rotors, volume 2
NASA Technical Reports Server (NTRS)
Chalupnik, J. D.; Clark, L. T.
1975-01-01
Computer programs were developed for use in the analysis of sound generation by subsonic rotors. Program AIRFOIL computes the spectrum of radiated sound from a single airfoil immersed in a laminar flow field. Program ROTOR extends this to a rotating frame, and provides a model for sound generation in subsonic rotors. The program also computes tone sound generation due to steady state forces on the blades. Program TONE uses a moving source analysis to generate a time series for an array of forces moving in a circular path. The resultant time series are then Fourier transformed to render the results in spectral form. Program SDATA is a standard time series analysis package. It reads in two discrete time series, forms auto and cross covariances, and normalizes these to form correlations. The program then transforms the covariances to yield auto and cross power spectra by means of a Fourier transformation.
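The SDATA pipeline described above — covariance estimates, normalization to correlations, then Fourier transformation to power spectra — can be sketched in plain Python. This is a hedged stand-in, not the FORTRAN program itself: a direct DFT replaces whatever transform SDATA uses, the covariance estimator is the biased one, and all function names are illustrative.

```python
import cmath

def cross_covariance(x, y, max_lag):
    """Biased cross-covariance estimate c_xy(k) for lags k = 0..max_lag."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return [sum((x[t] - mx) * (y[t + k] - my) for t in range(n - k)) / n
            for k in range(max_lag + 1)]

def correlation(x, y, max_lag):
    """Normalize the cross-covariance by the zero-lag auto terms."""
    norm = (cross_covariance(x, x, 0)[0] * cross_covariance(y, y, 0)[0]) ** 0.5
    return [c / norm for c in cross_covariance(x, y, max_lag)]

def spectrum(cov):
    """Direct DFT of a covariance sequence, yielding a power spectrum."""
    m = len(cov)
    return [abs(sum(cov[k] * cmath.exp(-2j * cmath.pi * f * k / m)
                    for k in range(m)))
            for f in range(m)]
```

With x == y the same routines produce the auto-covariance and auto-spectrum; the zero-lag correlation of a series with itself is 1 by construction.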
The RANDOM computer program: A linear congruential random number generator
NASA Technical Reports Server (NTRS)
Miles, R. F., Jr.
1986-01-01
The RANDOM Computer Program is a FORTRAN program for generating random number sequences and testing linear congruential random number generators (LCGs). The linear congruential form of random number generator is discussed, and the selection of parameters of an LCG for a microcomputer is described. This document describes the following: (1) The RANDOM Computer Program; (2) RANDOM.MOD, the computer code needed to implement an LCG in a FORTRAN program; and (3) The RANCYCLE and the ARITH Computer Programs that provide computational assistance in the selection of parameters for an LCG. The RANDOM, RANCYCLE, and ARITH Computer Programs are written in Microsoft FORTRAN for the IBM PC microcomputer and its compatibles. With only minor modifications, the RANDOM Computer Program and its LCG can be run on most microcomputers or mainframe computers.
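The linear congruential form the report discusses follows the recurrence x_{n+1} = (a·x_n + c) mod m. A minimal Python sketch is below; the multiplier, increment, and modulus are illustrative glibc-style values, not the parameters the report selects (choosing good parameters is exactly what RANCYCLE and ARITH assist with).

```python
def lcg(seed, a=1103515245, c=12345, m=2**31):
    """Linear congruential generator: x_{n+1} = (a*x_n + c) mod m,
    yielding values normalized to [0, 1)."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x / m

stream = lcg(seed=42)
sample = [next(stream) for _ in range(3)]
```

The stream is fully deterministic for a fixed seed, which is what makes LCG-based simulations reproducible.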
Interactive Computation for Undergraduates: The Next Generation
NASA Astrophysics Data System (ADS)
Kolan, Amy J.
2017-05-01
A generation ago (29 years ago), Leo Kadanoff and Michael Vinson created the Computers, Chaos, and Physics course. A major pedagogical thrust of this course was to help students form and test hypotheses via computer simulation of small problems in physics. Recently, this aspect of the 1987 course has been revived for use with first year physics undergraduate students at St. Olaf College.
IDEA Technical Report No. 4. Description of IDEA Standard Form Data Base.
ERIC Educational Resources Information Center
Cashin, William E.; Perrin, Bruce M.
The data and computational procedures used by the IDEA System to generate IDEA Reports from information collected on the Standard Form of the IDEA Survey Form are described in this technical report. The computations for each of the seven parts of the IDEA Report are explained. The data base used for this 1978-79 Kansas State University study…
ERIC Educational Resources Information Center
Yeo, Tiong-Meng; Quek, Choon-Lang
2014-01-01
This comparative study investigates how two groups of design and technology students generated ideas in an asynchronous computer-mediated communication setting. The generated ideas were design ideas in the form of sketches. Each group comprised five students who were all 15 years of age. All the students were from the same secondary school but…
The Generative Effects of Instructional Organizers with Computer-Based Interactive Video.
ERIC Educational Resources Information Center
Kenny, Richard F.
This study compared the use of three instructional organizers--the advance organizer (AO), the participatory pictorial graphic organizer (PGO), and the final form pictorial graphic organizer (FGO)--in the design and use of computer-based interactive video (CBIV) programs. That is, it attempted to determine whether a less generative or more…
An Empirical Generative Framework for Computational Modeling of Language Acquisition
ERIC Educational Resources Information Center
Waterfall, Heidi R.; Sandbank, Ben; Onnis, Luca; Edelman, Shimon
2010-01-01
This paper reports progress in developing a computer model of language acquisition in the form of (1) a generative grammar that is (2) algorithmically learnable from realistic corpus data, (3) viable in its large-scale quantitative performance and (4) psychologically real. First, we describe new algorithmic methods for unsupervised learning of…
Space station needs, attributes and architectural options study
NASA Technical Reports Server (NTRS)
1983-01-01
All the candidate Technology Development missions investigated during the space station needs, attributes, and architectural options study are described. All the mission data forms, plus additional information such as cost, drawings, and functional flows generated in support of these missions, are included with a computer-generated mission data form.
48 CFR 1913.505-2 - Board order forms in lieu of Optional and Standard Forms.
Code of Federal Regulations, 2011 CFR
2011-10-01
... BROADCASTING BOARD OF GOVERNORS CONTRACTING METHODS AND CONTRACT TYPES SMALL PURCHASES AND OTHER SIMPLIFIED...-case basis, in order to accommodate computer-generated purchase order forms. Exception approval for...
48 CFR 1913.505-2 - Board order forms in lieu of Optional and Standard Forms.
Code of Federal Regulations, 2010 CFR
2010-10-01
... BROADCASTING BOARD OF GOVERNORS CONTRACTING METHODS AND CONTRACT TYPES SMALL PURCHASES AND OTHER SIMPLIFIED...-case basis, in order to accommodate computer-generated purchase order forms. Exception approval for...
NASA Astrophysics Data System (ADS)
Narayanaswami, Chandra; Raghunath, Mandayam T.
2004-09-01
We outline a collection of technological challenges in the design of wearable computers with a focus on one of the most desirable form-factors, the wrist watch. We describe our experience with building three generations of wrist watch computers. We built these research prototypes as platforms to investigate the fundamental limitations of wearable computing. Results of our investigations are presented in the form of challenges that have been overcome and those that still remain.
Shick, G L; Hoover, L W; Moore, A N
1979-04-01
A data base was developed for a computer-assisted personnel data system for a university hospital department of dietetics which would store data on employees' employment, personnel information, attendance records, and termination. Development of the data base required designing computer programs and files, coding directions and forms for card input, and forms and procedures for on-line transmission. A program was written to compute accrued vacation, sick leave, and holiday time, and to generate historical records.
The Nature of Computer Assisted Learning.
ERIC Educational Resources Information Center
Whiting, John
Computer assisted learning (CAL) is an old technology which has generated much new interest. Computers can: reduce data to a directly comprehensible form; reduce administration; communicate worldwide and exchange, store, and retrieve data; and teach. The computer's limitation is in its dependence on the user's ability and perceptive nature.…
3D Model Generation From the Engineering Drawing
NASA Astrophysics Data System (ADS)
Vaský, Jozef; Eliáš, Michal; Bezák, Pavol; Červeňanská, Zuzana; Izakovič, Ladislav
2010-01-01
The contribution deals with the transformation of engineering drawings in paper form into a 3D computer representation. A 3D computer model can be further processed in a CAD/CAM system, modified, and archived, and a technical drawing can then be generated from it as well. The transformation from paper form to digital data is complex and difficult, particularly owing to the different types of drawings, the forms of the displayed objects, and the errors and deviations from technical standards that are encountered. The algorithm for generating a 3D model from an orthogonal vector input representing a simplified technical drawing of a rotational part is described in this contribution. The algorithm was experimentally implemented as an ObjectARX application in the AutoCAD system, and a test sample representing the rotational part was used for verification.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-16
... Payment Request for the VA Funding Fee Payment System (VA FFPS); a Computer Generated Funding Fee Receipt.... 2900-0474.'' SUPPLEMENTARY INFORMATION: Title: Create Payment Request for the VA Funding Fee Payment System (VA FFPS); a Computer Generated Funding Fee Receipt, VA Form 26-8986. OMB Control Number: 2900...
Teaching French Transformational Grammar by Means of Computer-Generated Video-Tapes.
ERIC Educational Resources Information Center
Adler, Alfred; Thomas, Jean Jacques
This paper describes a pilot program in an integrated media presentation of foreign languages and the production and usage of seven computer-generated video tapes which demonstrate various aspects of French syntax. This instructional set could form the basis for CAI lessons in which the student is presented images identical to those on the video…
Pseudo-random number generator for the Sigma 5 computer
NASA Technical Reports Server (NTRS)
Carroll, S. N.
1983-01-01
A technique is presented for developing a pseudo-random number generator based on the linear congruential form. The two numbers used for the generator are a prime number and a corresponding primitive root, where the prime is the largest prime number that can be accurately represented on a particular computer. The primitive root is selected by applying Marsaglia's lattice test. The technique presented was applied to write a random number program for the Sigma 5 computer. The new program, named S:RANDOM1, is judged to be superior to the older program named S:RANDOM. For applications requiring several independent random number generators, a table is included showing several acceptable primitive roots. The technique and programs described can be applied to any computer having word length different from that of the Sigma 5.
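The construction described — a multiplicative congruential generator whose modulus is a large prime and whose multiplier is a primitive root of that prime — can be sketched as follows. The pair below (Mersenne prime 2^31 − 1 with primitive root 16807) is the classic Lewis-Goodman-Miller choice, used here for illustration; it is not necessarily the pair selected for the Sigma 5, whose word length differs.

```python
M = 2**31 - 1   # Mersenne prime modulus
A = 16807       # a primitive root modulo M, so the sequence has full period M - 1

def lehmer(seed):
    """Multiplicative congruential generator: x_{n+1} = (A * x_n) mod M."""
    x = seed % M or 1   # zero is absorbing under multiplication, so avoid it
    while True:
        x = (A * x) % M
        yield x
```

Because A is a primitive root, the sequence visits every value in 1..M−1 before repeating; a table of acceptable primitive roots (as in the report) yields independent generators from the same modulus.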
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-27
... Payment Request for the VA Funding Fee Payment System (VA FFPS); a Computer Generated Funding Fee Receipt.... Title: Create Payment Request for the VA Funding Fee Payment System (VA FFPS); A Computer Generated Funding Fee Receipt, VA Form 26-8986. OMB Control Number: 2900-0474. Type of Review: Revision of a...
ERIC Educational Resources Information Center
Shubik, Martin
The main problem in computer gaming research is the initial decision of choosing the type of gaming method to be used. Free-form games lead to exciting open-ended confrontations that generate much information. However, they do not easily lend themselves to analysis because they generate far too much information and their results are seldom…
78 FR 42819 - Proposed Collection; Comment Request for Form 709
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-17
... 709, United States Gift (and Generation-Skipping Transfer) Tax Return. DATES: Written comments should....gov . SUPPLEMENTARY INFORMATION: Title: United States Gift (and Generation-Skipping Transfer) Tax... transfers subject to the gift and generation-skipping transfer taxes and to compute these taxes. The IRS...
75 FR 38179 - Proposed Collection; Comment Request for Form 709
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-01
... 709, United States Gift (and Generation-Skipping Transfer) Tax Return. DATES: Written comments should....gov . SUPPLEMENTARY INFORMATION: Title: United States Gift (and Generation-Skipping Transfer) Tax... transfers subject to the gift and generation-skipping transfer taxes and to compute these taxes. The IRS...
Microprocessor Control of Low Speed VSTOL Flight.
1979-06-08
... Analog; IAS, Indicated Air Speed; I/O, Input/Output; KIAS, Knots, Indicated Air Speed; NATOPS, Naval Air Training and Operating Procedures Standardization; SAS, ... computer programming necessary in the research, and contain, in the form of computer-generated time histories, the results of the project. ... of the aircraft causes airflow over the wings and therefore produces aerodynamic lift. As the transition progresses, wing-generated lift gradually
Accretor: Generative Materiality in the Work of Driessens and Verstappen.
Whitelaw, Mitchell
2015-01-01
Accretor, by the Dutch artists Erwin Driessens and Maria Verstappen, is a generative artwork that adopts and adapts artificial life techniques to produce intricate three-dimensional forms. This article introduces and analyzes Accretor, considering the enigmatic quality of the generated objects and in particular the role of materiality in this highly computational work. Accretor demonstrates a tangled continuity between digital and physical domains, where the constraints and affordances of matter inform both formal processes and aesthetic interpretations. Drawing on Arp's notion of the concrete artwork and McCormack and Dorin's notion of the computational sublime, the article finally argues that Accretor demonstrates what might be called a processual sublime, evoking expansive processes that span both computational and non-computational systems.
Implementation of control point form of algebraic grid-generation technique
NASA Technical Reports Server (NTRS)
Choo, Yung K.; Miller, David P.; Reno, Charles J.
1991-01-01
The control point form (CPF) provides explicit control of physical grid shape and grid spacing through the movement of the control points. The control point array, called a control net, is a space grid type arrangement of locations in physical space with an index for each direction. As an algebraic method CPF is efficient and works well with interactive computer graphics. A family of menu-driven, interactive grid-generation computer codes (TURBO) is being developed by using CPF. Key features of TurboI (a TURBO member) are discussed and typical results are presented. TurboI runs on any IRIS 4D series workstation.
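CPF itself is a more elaborate algebraic scheme, but the control-net idea — explicit shape control through a small indexed array of movable points — can be illustrated with a simple tensor-product Bernstein (Bézier) blend. This is an assumption-laden stand-in, not the TURBO implementation: moving one point of the coarse net smoothly reshapes the nearby region of the fine grid.

```python
from math import comb

def bezier_grid(net, nu, nv):
    """Map a coarse control net net[a][b] = (x, y) to an nu-by-nv physical
    grid via tensor-product Bernstein blending."""
    m = len(net) - 1       # control-net degree in the u direction
    n = len(net[0]) - 1    # and in the v direction
    def bern(k, deg, t):
        return comb(deg, k) * t**k * (1 - t)**(deg - k)
    grid = []
    for i in range(nu):
        u = i / (nu - 1)
        row = []
        for j in range(nv):
            v = j / (nv - 1)
            px = sum(bern(a, m, u) * bern(b, n, v) * net[a][b][0]
                     for a in range(m + 1) for b in range(n + 1))
            py = sum(bern(a, m, u) * bern(b, n, v) * net[a][b][1]
                     for a in range(m + 1) for b in range(n + 1))
            row.append((px, py))
        grid.append(row)
    return grid

# a 3x3 control net for the unit square; perturbing the center point
# would curve the interior grid lines while the corners stay pinned
net = [[(x, y) for y in (0.0, 0.5, 1.0)] for x in (0.0, 0.5, 1.0)]
```

As in CPF, the method is purely algebraic — grid points are evaluated directly from the net, with no elliptic solve — which is what makes it fast enough for interactive graphics.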
78 FR 70411 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-25
.... Title: United States Estate (and Generation-Skipping Transfer) Tax Return. Form: 706 and related schedules. Abstract: Form 706 is used by executors to report and compute the Federal Estate Tax imposed by... currently approved collection. Title: Return by a U.S. Transferor of Property to a Foreign Corporation. Form...
McMullin, Julie Ann; Duerden Comeau, Tammy; Jovic, Emily
2007-06-01
Sociologists theorizing the concept of 'generation' have traditionally looked to birth cohorts sharing major social upheavals such as war or decolonization to explain issues of generational solidarity and identity affiliation. More recently, theorists have drawn attention to the cultural elements where generations are thought to be formed through affinities with music or other types of popular culture during the 'coming of age' stage of life. In this paper, we ask whether developments in computer technology, which have both productive and cultural components, provide a basis for generational formation and identity and whether generational discourse is invoked to create cultures of difference in the workplace. Qualitative data from a sample of Information Technology workers show that these professionals mobilize 'generational' discourse and draw upon notions of 'generational affinity' with computing technology (e.g. the fact that people of different ages were immersed to varying degrees in different computing technologies) in explaining the youthful profile of IT workers and employees' differing levels of technological expertise.
Computing Shapes Of Cascade Diffuser Blades
NASA Technical Reports Server (NTRS)
Tran, Ken; Prueger, George H.
1993-01-01
Computer program generates sizes and shapes of cascade-type blades for use in axial or radial turbomachine diffusers. Generates shapes of blades rapidly, incorporating extensive cascade data to determine optimum incidence and deviation angle for blade design based on the 65-series data base of the National Advisory Committee for Aeronautics (NACA). Allows great variability in blade profile through input variables. Also provides for design of three-dimensional blades by allowing variable blade stacking. Enables designer to obtain computed blade-geometry data in various forms: as input for blade-loading analysis; as input for quasi-three-dimensional analysis of flow; or as points for transfer to computer-aided design.
Computer-aided engineering system for design of sequence arrays and lithographic masks
Hubbell, Earl A.; Lipshutz, Robert J.; Morris, Macdonald S.; Winkler, James L.
1997-01-01
An improved set of computer tools for forming arrays. According to one aspect of the invention, a computer system is used to select probes and design the layout of an array of DNA or other polymers with certain beneficial characteristics. According to another aspect of the invention, a computer system uses chip design files to design and/or generate lithographic masks.
2011-11-01
Unstructured methods for region discretization have become common in computational fluid dynamics (CFD) analysis because of certain benefits ... application of Winslow elliptic smoothing equations to unstructured meshes. It has been shown that it is not necessary for the computational space of ... the Poisson form of the equations can also be generated by manipulating the computational space, so forcing functions become superfluous.
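The full Winslow equations couple the grid coordinates through an inverted elliptic system. As a rough, hedged stand-in for the smoothing behavior described above, the sketch below applies plain Laplacian smoothing to an unstructured mesh — each free vertex relaxes toward the centroid of its edge-connected neighbors. This is a simplification, not Winslow smoothing itself, but it shows the same iterative relax-toward-neighbors structure.

```python
def laplacian_smooth(points, edges, fixed, iters=50, relax=0.5):
    """Smooth an unstructured mesh: move each free vertex toward the
    centroid of its neighbors (boundary vertices in `fixed` stay put)."""
    nbrs = {i: set() for i in range(len(points))}
    for a, b in edges:
        nbrs[a].add(b)
        nbrs[b].add(a)
    pts = [list(p) for p in points]
    for _ in range(iters):
        for i, nb in nbrs.items():
            if i in fixed or not nb:
                continue
            cx = sum(pts[j][0] for j in nb) / len(nb)
            cy = sum(pts[j][1] for j in nb) / len(nb)
            pts[i][0] += relax * (cx - pts[i][0])
            pts[i][1] += relax * (cy - pts[i][1])
    return [tuple(p) for p in pts]
```

For a vertex tied to the four corners of a unit square, repeated relaxation drives it to the centroid (0.5, 0.5) regardless of where it started.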
On grey levels in random CAPTCHA generation
NASA Astrophysics Data System (ADS)
Newton, Fraser; Kouritzin, Michael A.
2011-06-01
A CAPTCHA is an automatically generated test designed to distinguish between humans and computer programs; specifically, it is designed to be easy for humans but difficult for computer programs to pass, in order to prevent the abuse of resources by automated bots. CAPTCHAs are commonly seen guarding webmail registration forms and online auction sites, and preventing brute-force attacks on passwords. In the following, we address the question: How does adding a grey level to random CAPTCHA generation affect the utility of the CAPTCHA? We treat the problem of generating the random CAPTCHA as one of random field simulation: an initial state of background noise is evolved over time using Gibbs sampling and an efficient algorithm for generating correlated random variables. This approach has already been found to yield highly readable yet difficult-to-crack CAPTCHAs. We detail how the requisite parameters for introducing grey levels are estimated and how we generate the random CAPTCHA. The resulting CAPTCHA will be evaluated in terms of human readability as well as its resistance to automated attacks in the form of character segmentation and optical character recognition.
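As a toy sketch of the random-field idea (not the paper's estimated parameters or its efficient correlated-variable algorithm), the code below evolves a small field over three grey levels with single-site Gibbs sampling: each pixel is resampled from a Boltzmann distribution that rewards agreement with its four neighbors, so the initial noise develops spatial correlation as sweeps accumulate.

```python
import math
import random

def gibbs_field(w, h, levels=(0.0, 0.5, 1.0), beta=1.5, sweeps=20, seed=0):
    """Evolve a w-by-h random field over black/grey/white levels by Gibbs
    sampling; beta controls how strongly neighbors are pulled to agree."""
    rng = random.Random(seed)
    field = [[rng.choice(levels) for _ in range(w)] for _ in range(h)]
    for _ in range(sweeps):
        for y in range(h):
            for x in range(w):
                nbrs = [field[ny][nx]
                        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                        if 0 <= ny < h and 0 <= nx < w]
                # conditional weight per level: exp(-beta * squared disagreement)
                weights = [math.exp(-beta * sum((v - n) ** 2 for n in nbrs))
                           for v in levels]
                r = rng.random() * sum(weights)
                acc = 0.0
                for v, wt in zip(weights and levels, weights):
                    acc += wt
                    if r <= acc:
                        field[y][x] = v
                        break
    return field

noise = gibbs_field(12, 12, sweeps=0)    # raw three-level noise
smooth = gibbs_field(12, 12, sweeps=20)  # spatially correlated field
```

Dropping the middle level from `levels` recovers a plain black-and-white field, which is the comparison the paper's grey-level question is about.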
Automatic Generation of Analogy Questions for Student Assessment: An Ontology-Based Approach
ERIC Educational Resources Information Center
Alsubait, Tahani; Parsia, Bijan; Sattler, Uli
2012-01-01
Different computational models for generating analogies of the form "A is to B as C is to D" have been proposed over the past 35 years. However, analogy generation is a challenging problem that requires further research. In this article, we present a new approach for generating analogies in Multiple Choice Question (MCQ) format that can be used…
Effects of Fluid Environment on Microbial Uptake Kinetics
1990-09-26
Marine snow particles, large amorphous aggregates that form in marine systems ... is crucial for the performance of all biological wastewater treatment ... particle trajectories in computer models (Tambo and Watanabe 1979). These computer-generated aggregates ... the water column (Table 2). This analysis
Volume-preserving normal forms of Hopf-zero singularity
NASA Astrophysics Data System (ADS)
Gazor, Majid; Mokhtari, Fahimeh
2013-10-01
A practical method is described for computing the unique generator of the algebra of first integrals associated with a large class of Hopf-zero singularities. The set of all volume-preserving classical normal forms of this singularity is introduced via a Lie algebra description. This is a maximal vector space of classical normal forms with a first integral, which is why our approach works. Systems with a nonzero condition on their quadratic parts are considered. The algebra of all first integrals for any such system has a unique (modulo scalar multiplication) generator. The infinite level volume-preserving parametric normal forms of any nondegenerate perturbation within the Lie algebra of any such system are computed; such perturbations can have rich dynamics. The associated unique generator of the algebra of first integrals is derived. The symmetry group of the infinite level normal forms is also discussed. Some necessary formulas are derived and applied to appropriately modified Rössler and generalized Kuramoto-Sivashinsky equations to demonstrate the applicability of our theoretical results. An approach (introduced by Iooss and Lombardi) is applied to find an optimal truncation for the first level normal forms of these examples with exponentially small remainders. The numerically suggested radius of convergence (for the first integral) associated with a hypernormalization step is discussed for the truncated first level normal forms of the examples. This is achieved by an efficient implementation of the results using Maple.
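For orientation, a minimal sketch of the standard background (not the paper's computation): a Hopf-zero singularity at the origin is a system whose linear part has eigenvalues 0 and ±iω, i.e. in suitable real coordinates

```latex
% Linear part of a Hopf-zero singularity: eigenvalues 0 and \pm i\omega
\dot{\mathbf{x}} = L\,\mathbf{x} + \mathbf{f}(\mathbf{x}), \qquad
L = \begin{pmatrix} 0 & -\omega & 0 \\ \omega & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix},
\qquad \mathbf{f}(\mathbf{x}) = O\!\left(|\mathbf{x}|^{2}\right).
```

The normal-form transformations studied in the paper simplify the nonlinear part f while, in the volume-preserving class, keeping the flow divergence-free.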
An empirical generative framework for computational modeling of language acquisition.
Waterfall, Heidi R; Sandbank, Ben; Onnis, Luca; Edelman, Shimon
2010-06-01
This paper reports progress in developing a computer model of language acquisition in the form of (1) a generative grammar that is (2) algorithmically learnable from realistic corpus data, (3) viable in its large-scale quantitative performance and (4) psychologically real. First, we describe new algorithmic methods for unsupervised learning of generative grammars from raw CHILDES data and give an account of the generative performance of the acquired grammars. Next, we summarize findings from recent longitudinal and experimental work that suggests how certain statistically prominent structural properties of child-directed speech may facilitate language acquisition. We then present a series of new analyses of CHILDES data indicating that the desired properties are indeed present in realistic child-directed speech corpora. Finally, we suggest how our computational results, behavioral findings, and corpus-based insights can be integrated into a next-generation model aimed at meeting the four requirements of our modeling framework.
Computer-aided engineering system for design of sequence arrays and lithographic masks
Hubbell, Earl A.; Morris, MacDonald S.; Winkler, James L.
1999-01-05
An improved set of computer tools for forming arrays. According to one aspect of the invention, a computer system (100) is used to select probes and design the layout of an array of DNA or other polymers with certain beneficial characteristics. According to another aspect of the invention, a computer system uses chip design files (104) to design and/or generate lithographic masks (110).
Computer-aided engineering system for design of sequence arrays and lithographic masks
Hubbell, Earl A.; Morris, MacDonald S.; Winkler, James L.
1996-01-01
An improved set of computer tools for forming arrays. According to one aspect of the invention, a computer system (100) is used to select probes and design the layout of an array of DNA or other polymers with certain beneficial characteristics. According to another aspect of the invention, a computer system uses chip design files (104) to design and/or generate lithographic masks (110).
Computer-aided engineering system for design of sequence arrays and lithographic masks
Hubbell, E.A.; Morris, M.S.; Winkler, J.L.
1999-01-05
An improved set of computer tools for forming arrays is disclosed. According to one aspect of the invention, a computer system is used to select probes and design the layout of an array of DNA or other polymers with certain beneficial characteristics. According to another aspect of the invention, a computer system uses chip design files to design and/or generate lithographic masks. 14 figs.
Computer-aided engineering system for design of sequence arrays and lithographic masks
Hubbell, E.A.; Lipshutz, R.J.; Morris, M.S.; Winkler, J.L.
1997-01-14
An improved set of computer tools for forming arrays is disclosed. According to one aspect of the invention, a computer system is used to select probes and design the layout of an array of DNA or other polymers with certain beneficial characteristics. According to another aspect of the invention, a computer system uses chip design files to design and/or generate lithographic masks. 14 figs.
Computer-aided engineering system for design of sequence arrays and lithographic masks
Hubbell, E.A.; Morris, M.S.; Winkler, J.L.
1996-11-05
An improved set of computer tools for forming arrays is disclosed. According to one aspect of the invention, a computer system is used to select probes and design the layout of an array of DNA or other polymers with certain beneficial characteristics. According to another aspect of the invention, a computer system uses chip design files to design and/or generate lithographic masks. 14 figs.
Computer algebra and operators
NASA Technical Reports Server (NTRS)
Fateman, Richard; Grossman, Robert
1989-01-01
The symbolic computation of operator expansions is discussed. Some of the capabilities that prove useful when performing computer algebra computations involving operators are considered. These capabilities may be broadly divided into three areas: the algebraic manipulation of expressions from the algebra generated by operators; the algebraic manipulation of the actions of the operators upon other mathematical objects; and the development of appropriate normal forms and simplification algorithms for operators and their actions. Brief descriptions are given of the computer algebra computations that arise when working with various operators and their actions.
From micro to mainframe. A practical approach to perinatal data processing.
Yeh, S Y; Lincoln, T
1985-04-01
A new, practical approach to perinatal data processing for a large obstetric population is described. This was done with a microcomputer for data entry and a mainframe computer for data reduction. The Screen Oriented Data Access (SODA) program was used to generate the data entry form and to input data into the Apple II Plus computer. Data were stored on diskettes and transmitted through a modem and telephone line to the IBM 370/168 computer. The Statistical Analysis System (SAS) program was used for statistical analyses and report generation. This approach was found to be most practical, flexible, and economical.
NASA Technical Reports Server (NTRS)
Jones, Robert E.; Kramarchuk, Ihor; Williams, Wallace D.; Pouch, John J.; Gilbert, Percy
1989-01-01
Computer-controlled thermal-wave microscope developed to investigate III-V compound semiconductor devices and materials. Is nondestructive technique providing information on subsurface thermal features of solid samples. Furthermore, because this is subsurface technique, three-dimensional imaging also possible. Microscope uses intensity-modulated electron beam of modified scanning electron microscope to generate thermal waves in sample. Acoustic waves generated by thermal waves received by transducer and processed in computer to form images displayed on video display of microscope or recorded on magnetic disk.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Salama, A.; Mikhail, M.
Comprehensive software packages have been developed at the Western Research Centre as tools to help coal preparation engineers analyze, evaluate, and control coal cleaning processes. The COal Preparation Software package (COPS) performs three functions: (1) data handling and manipulation; (2) data analysis, including the generation of washability data, performance evaluation and prediction, density and size modeling, evaluation of density and size partition characteristics and attrition curves; and (3) generation of graphics output. The Separation ChARacteristics Estimation software packages (SCARE) were developed to balance raw density or size separation data; the cases of density and size separation data are both considered. The generated balanced data can take the balanced or normalized forms. The scaled form is desirable for direct determination of the partition functions (curves). The raw and generated separation data are displayed in tabular and/or graphical forms. The computer software described in this paper provides valuable tools for coal preparation plant engineers and operators for evaluating process performance, adjusting plant parameters, and balancing raw density or size separation data. These packages have been applied very successfully in many projects carried out by WRC for the Canadian coal preparation industry. The software packages are designed to run on a personal computer (PC).
A Computer Program for the Calculation of Three-Dimensional Transonic Nacelle/Inlet Flowfields
NASA Technical Reports Server (NTRS)
Vadyak, J.; Atta, E. H.
1983-01-01
A highly efficient computer analysis was developed for predicting transonic nacelle/inlet flowfields. This algorithm can compute the three dimensional transonic flowfield about axisymmetric (or asymmetric) nacelle/inlet configurations at zero or nonzero incidence. The flowfield is determined by solving the full-potential equation in conservative form on a body-fitted curvilinear computational mesh. The difference equations are solved using the AF2 approximate factorization scheme. This report presents a discussion of the computational methods used to both generate the body-fitted curvilinear mesh and to obtain the inviscid flow solution. Computed results and correlations with existing methods and experiment are presented. Also presented are discussions on the organization of the grid generation (NGRIDA) computer program and the flow solution (NACELLE) computer program, descriptions of the respective subroutines, definitions of the required input parameters for both algorithms, a brief discussion on interpretation of the output, and sample cases to illustrate application of the analysis.
Single-Frame Cinema. Three Dimensional Computer-Generated Imaging.
ERIC Educational Resources Information Center
Cheetham, Edward Joseph, II
This master's thesis provides a description of the proposed art form called single-frame cinema, which is a category of computer imagery that takes the temporal polarities of photography and cinema and unites them into a single visual vignette of time. Following introductory comments, individual chapters discuss (1) the essential physical…
A Behavioral Study of Regularity, Irregularity and Rules in the English Past Tense
ERIC Educational Resources Information Center
Magen, Harriet S.
2014-01-01
Opposing views of storage and processing of morphologically complex words (e.g., past tense) have been suggested: the dual system, whereby regular forms are not in the lexicon but are generated by rule, while irregular forms are explicitly represented; the single system, whereby regular and irregular forms are computed by a single system, using…
Why do parallel cortical systems exist for the perception of static form and moving form?
Grossberg, S
1991-02-01
This article analyzes computational properties that clarify why the parallel cortical systems V1→V2, V1→MT, and V1→V2→MT exist for the perceptual processing of static visual forms and moving visual forms. The article describes a symmetry principle, called FM symmetry, that is predicted to govern the development of these parallel cortical systems by computing all possible ways of symmetrically gating sustained cells with transient cells and organizing these sustained-transient cells into opponent pairs of on-cells and off-cells whose output signals are insensitive to direction of contrast. This symmetric organization explains how the static form system (static BCS) generates emergent boundary segmentations whose outputs are insensitive to direction of contrast and insensitive to direction of motion, whereas the motion form system (motion BCS) generates emergent boundary segmentations whose outputs are insensitive to direction of contrast but sensitive to direction of motion. FM symmetry clarifies why the geometries of static and motion form perception differ--for example, why the opposite orientation of vertical is horizontal (90 degrees), but the opposite direction of up is down (180 degrees). Opposite orientations and directions are embedded in gated dipole opponent processes that are capable of antagonistic rebound. Negative afterimages, such as the MacKay and waterfall illusions, are hereby explained, as are aftereffects of long-range apparent motion. These antagonistic rebounds help to control a dynamic balance between complementary perceptual states of resonance and reset. Resonance cooperatively links features into emergent boundary segmentations via positive feedback in a CC loop, and reset terminates a resonance when the image changes, thereby preventing massive smearing of percepts.
These complementary preattentive states of resonance and reset are related to analogous states that govern attentive feature integration, learning, and memory search in adaptive resonance theory. The mechanism used in the V1→MT system to generate a wave of apparent motion between discrete flashes may also be used in other cortical systems to generate spatial shifts of attention. The theory suggests how the V1→V2→MT cortical stream helps to compute moving form in depth and how long-range apparent motion of illusory contours occurs. These results collectively argue against vision theories that espouse independent processing modules. Instead, specialized subsystems interact to overcome computational uncertainties and complementary deficiencies, to cooperatively bind features into context-sensitive resonances, and to realize symmetry principles that are predicted to govern the development of the visual cortex.
Integrated IMA (Information Mission Areas) IC (Information Center) Guide
1989-06-01
COMPUTER AIDED DESIGN / COMPUTER AIDED MANUFACTURE; 8.3.7 LIQUID CRYSTAL DISPLAY PANELS; 8.3.8 ARTIFICIAL INTELLIGENCE APPLIED TO VI; … 10.3.1 DESKTOP PUBLISHING; 10.3.2 INTELLIGENT COPIERS; 10.3.3 ELECTRONIC ALTERNATIVES TO PRINTED DOCUMENTS; 10.3.4 ELECTRONIC FORMS; … (table-of-contents fragment; remaining column entries include Optical Disk Storage, LCD Units, Image Scanners, Graphics Software, Forms Generation, Output Devices, Intelligent Copiers, Work Group)
Automatic code generation in SPARK: Applications of computer algebra and compiler-compilers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nataf, J.M.; Winkelmann, F.
We show how computer algebra and compiler-compilers are used for automatic code generation in the Simulation Problem Analysis and Research Kernel (SPARK), an object oriented environment for modeling complex physical systems that can be described by differential-algebraic equations. After a brief overview of SPARK, we describe the use of computer algebra in SPARK's symbolic interface, which generates solution code for equations that are entered in symbolic form. We also describe how the Lex/Yacc compiler-compiler is used to achieve important extensions to the SPARK simulation language, including parametrized macro objects and steady-state resetting of a dynamic simulation. The application of these methods to solving the partial differential equations for two-dimensional heat flow is illustrated.
Automatic code generation in SPARK: Applications of computer algebra and compiler-compilers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nataf, J.M.; Winkelmann, F.
We show how computer algebra and compiler-compilers are used for automatic code generation in the Simulation Problem Analysis and Research Kernel (SPARK), an object oriented environment for modeling complex physical systems that can be described by differential-algebraic equations. After a brief overview of SPARK, we describe the use of computer algebra in SPARK's symbolic interface, which generates solution code for equations that are entered in symbolic form. We also describe how the Lex/Yacc compiler-compiler is used to achieve important extensions to the SPARK simulation language, including parametrized macro objects and steady-state resetting of a dynamic simulation. The application of these methods to solving the partial differential equations for two-dimensional heat flow is illustrated.
Computer Description of the M561 Utility Truck
1984-10-01
GIFT Computer Code; Sustainability Predictions for Army Spare Components Requirements for Combat (SPARC). …used as input to the GIFT computer code to generate target vulnerability data. …analysis requires input from the Geometric Information for Targets (GIFT) computer code. This report documents the combinatorial geometry (Com-Geom
Glazoff, Michael V.; Gering, Kevin L.; Garnier, John E.; Rashkeev, Sergey N.; Pyt'ev, Yuri Petrovich
2016-05-17
Embodiments discussed herein in the form of methods, systems, and computer-readable media deal with the application of advanced "projectional" morphological algorithms for solving a broad range of problems. In a method of performing projectional morphological analysis, an N-dimensional input signal is supplied. At least one N-dimensional form indicative of at least one feature in the N-dimensional input signal is identified. The N-dimensional input signal is filtered relative to the at least one N-dimensional form and an N-dimensional output signal is generated indicating results of the filtering at least as differences in the N-dimensional input signal relative to the at least one N-dimensional form.
Promoting Creativity through Assessment: A Formative Computer-Assisted Assessment Tool for Teachers
ERIC Educational Resources Information Center
Cropley, David; Cropley, Arthur
2016-01-01
Computer-assisted assessment (CAA) is problematic when it comes to fostering creativity, because in educational thinking the essence of creativity is not finding the correct answer but generating novelty. The idea of "functional" creativity provides rubrics that can serve as the basis for forms of CAA leading to either formative or…
Acquisition, representation and rule generation for procedural knowledge
NASA Technical Reports Server (NTRS)
Ortiz, Chris; Saito, Tim; Mithal, Sachin; Loftin, R. Bowen
1991-01-01
Current research into the design and continuing development of a system for the acquisition of procedural knowledge, its representation in useful forms, and proposed methods for automated C Language Integrated Production System (CLIPS) rule generation is discussed. The Task Analysis and Rule Generation Tool (TARGET) is intended to permit experts, individually or collectively, to visually describe and refine procedural tasks. The system is designed to represent the acquired knowledge in the form of graphical objects with the capacity for generating production rules in CLIPS. The generated rules can then be integrated into applications such as NASA's Intelligent Computer Aided Training (ICAT) architecture. Also described are proposed methods for use in translating the graphical and intermediate knowledge representations into CLIPS rules.
[Dental arch form reverting by four-point method].
Pan, Xiao-Gang; Qian, Yu-Fen; Weng, Si-En; Feng, Qi-Ping; Yu, Quan
2008-04-01
To explore a simple method of reverting an individual dental arch form template for wire bending, individual dental arch forms were reverted by a four-point method: by defining the central point of the bracket on the bilateral lower second premolars and first molars, a certain individual dental arch form could be generated. The arch form generating procedure was then developed into computer software for printing the arch form. The four-point method arch form was evaluated by comparison with direct model measurement on linear and angular parameters. Accuracy and reproducibility were assessed by paired t test and concordance correlation coefficient with the Medcalc 9.3 software package. The arch form obtained by the four-point method was of good accuracy and reproducibility (the linear concordance correlation coefficient was 0.9909 and the angular concordance correlation coefficient was 0.8419). The dental arch form reverted by the four-point method could reproduce the individual dental arch form.
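The paper's exact template construction is not given in the abstract; the following minimal sketch only illustrates the idea of recovering a symmetric arch curve from four bracket centre points. The parabolic model and the coordinates below are assumptions for illustration.

```python
import numpy as np

# Hypothetical sketch of the four-point idea: fit a symmetric curve through
# the four bracket centre points (bilateral lower second premolars and first
# molars) and sample it as an arch-form template. The parabolic model and the
# coordinates are assumptions, not the paper's actual construction.
pts = np.array([[-22.0, 30.0], [-17.0, 20.0],
                [17.0, 20.0], [22.0, 30.0]])   # (x, y) in mm, midline at x = 0

# Least-squares fit of y = a*x**2 + c (symmetric about the dental midline)
X = np.column_stack([pts[:, 0] ** 2, np.ones(len(pts))])
(a, c), *_ = np.linalg.lstsq(X, pts[:, 1], rcond=None)

# Sample the fitted arch form, e.g. for printing a bending template
xs = np.linspace(-25.0, 25.0, 101)
arch = a * xs ** 2 + c
```

Because the four points are mirror-symmetric, they supply exactly two distinct constraints, so this two-parameter fit passes through all four points.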
NASA Technical Reports Server (NTRS)
Poole, L. R.; Lecroy, S. R.; Morris, W. D.
1977-01-01
A computer program for studying linear ocean wave refraction is described. The program features random-access modular bathymetry data storage. Three bottom topography approximation techniques are available in the program which provide varying degrees of bathymetry data smoothing. Refraction diagrams are generated automatically and can be displayed graphically in three forms: Ray patterns with specified uniform deepwater ray density, ray patterns with controlled nearshore ray density, or crest patterns constructed by using a cubic polynomial to approximate crest segments between adjacent rays.
Computer-aided design of bevel gear tooth surfaces
NASA Technical Reports Server (NTRS)
Shuo, Hung Chang; Huston, Ronald L.; Coy, John J.
1989-01-01
This paper presents a computer-aided design procedure for generating bevel gears. The development is based on examining a perfectly plastic, cone-shaped gear blank rolling over a cutting tooth on a plane crown rack. The resulting impression on the plastic gear blank is the envelope of the cutting tooth. This impression and envelope thus form a conjugate tooth surface. Equations are presented for the locus of points on the tooth surface. The same procedures are then extended to simulate the generation of a spiral bevel gear. The corresponding governing equations are presented.
Computer aided design of bevel gear tooth surfaces
NASA Technical Reports Server (NTRS)
Chang, S. H.; Huston, R. L.; Coy, J. J.
1989-01-01
This paper presents a computer-aided design procedure for generating bevel gears. The development is based on examining a perfectly plastic, cone-shaped gear blank rolling over a cutting tooth on a plane crown rack. The resulting impression on the plastic gear blank is the envelope of the cutting tooth. This impression and envelope thus form a conjugate tooth surface. Equations are presented for the locus of points on the tooth surface. The same procedures are then extended to simulate the generation of a spiral bevel gear. The corresponding governing equations are presented.
Fabrication of computer-generated holograms using femtosecond laser direct writing.
Berlich, René; Richter, Daniel; Richardson, Martin; Nolte, Stefan
2016-04-15
We demonstrate a single-step fabrication method for computer-generated holograms based on femtosecond laser direct writing. To this end, a tightly arranged longitudinal waveguide array is inscribed directly into a transparent material. By tailoring the individual waveguide lengths, the phase profile of an incident laser beam can be arbitrarily adapted. The approach is verified in common borosilicate glass by inscribing a designed phase hologram, which forms the desired intensity pattern in its far field. The resulting performance is analyzed, and the potential as well as the limitations of the method are discussed.
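The inscription step aside, the underlying design principle (in the Fraunhofer approximation, the far field of a phase-only hologram is the Fourier transform of exp(i*phi)) can be sketched numerically. The linear phase ramp below is an illustrative assumption, not the hologram from the paper.

```python
import numpy as np

# Fraunhofer sketch of the design principle: the far field of a phase-only
# aperture is the Fourier transform of exp(i*phi). A 5-cycle linear phase
# ramp (a blazed grating) should steer the far-field peak to frequency bin 5.
N = 256
n = np.arange(N)
phi = 2.0 * np.pi * 5.0 * n / N      # 5-cycle phase ramp across the aperture
field = np.exp(1j * phi)             # unit-amplitude, phase-only aperture
intensity = np.abs(np.fft.fft(field)) ** 2
peak = int(np.argmax(intensity))     # all energy lands in one off-axis order
```

In the fabricated device, tailoring each waveguide's length plays the role of setting phi pixel by pixel.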
Generating the Infrared Spectra of Large Interstellar Molecules with Density Functional Theory
NASA Technical Reports Server (NTRS)
Bauschlicher, Charles W., Jr.; Arnold, James (Technical Monitor)
1999-01-01
It is now possible to compute IR (infrared) spectra of large molecules with an accuracy of 30 cm(-1), or better, using density functional theory. This is true for cations, anions, and neutrals. Thus it is possible to generate synthetic IR spectra that can help interpret experimental spectra and fill in for missing experimental data. These synthetic spectra can be used as input to interstellar models. In addition to IR spectra, it is possible to compute energetic properties to help understand which molecules can be formed in the interstellar environment.
1992-01-01
US GeoData tapes are computer tapes which contain cartographic data in digital form. The 1:2,000,000-scale data are available in two forms. The graphic form can be used to generate computer-plotted maps. The content and scale of the maps can be varied to meet your needs. The topologically-structured form of US GeoData is suitable for input to geographic information systems for use in spatial analysis and geographic studies. Both forms must be used in conjunction with appropriate software. US GeoData tapes offer convenience, accuracy, flexibility, and cost effectiveness to many map users. Business, industry, and government users who are involved in network planning and analysis, transportation, demography, land use, or any activity where data can be related to, or plotted on a map will find US GeoData a valuable resource.
75 FR 1584 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-12
... Food Service Program Claim for Reimbursement Form is used to collect meal and cost data from sponsors to determine the reimbursement entitlement for meals served. The form is sent to the Food and... payment system computes earnings to date and the number of meals to date and generates payments for the...
48 CFR 53.105 - Computer generation.
Code of Federal Regulations, 2013 CFR
2013-10-01
... Data Interchange or a format that can be translated into one of those standards. (b) The standards listed in paragraph (a)(2) of this section may also be used for submission of data set forth in other..., content, or sequence of the data elements, and the form carries the Standard or Optional Form number and...
48 CFR 53.105 - Computer generation.
Code of Federal Regulations, 2012 CFR
2012-10-01
... Data Interchange or a format that can be translated into one of those standards. (b) The standards listed in paragraph (a)(2) of this section may also be used for submission of data set forth in other..., content, or sequence of the data elements, and the form carries the Standard or Optional Form number and...
48 CFR 53.105 - Computer generation.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Data Interchange or a format that can be translated into one of those standards. (b) The standards listed in paragraph (a)(2) of this section may also be used for submission of data set forth in other..., content, or sequence of the data elements, and the form carries the Standard or Optional Form number and...
Scout trajectory error propagation computer program
NASA Technical Reports Server (NTRS)
Myler, T. R.
1982-01-01
Since 1969, flight experience has been used as the basis for predicting Scout orbital accuracy. The data used for calculating the accuracy consists of errors in the trajectory parameters (altitude, velocity, etc.) at stage burnout as observed on Scout flights. Approximately 50 sets of errors are used in Monte Carlo analysis to generate error statistics in the trajectory parameters. A covariance matrix is formed which may be propagated in time. The mechanization of this process resulted in computer program Scout Trajectory Error Propagation (STEP) and is described herein. Computer program STEP may be used in conjunction with the Statistical Orbital Analysis Routine to generate accuracy in the orbit parameters (apogee, perigee, inclination, etc.) based upon flight experience.
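STEP itself is not reproduced here; the following toy example only illustrates the covariance step the abstract describes: sample errors form a covariance matrix, which an assumed linearized sensitivity matrix maps into orbit-parameter space. All numbers are synthetic.

```python
import numpy as np

# Toy illustration of the covariance step described above; all numbers are
# synthetic stand-ins for the ~50 observed burnout-error sets.
rng = np.random.default_rng(0)

# 50 flight-error samples in 3 trajectory parameters
# (e.g. altitude, velocity, flight-path angle)
errors = rng.normal(0.0, [100.0, 5.0, 0.1], size=(50, 3))

# Sample covariance of the burnout errors (rows are flights)
P = np.cov(errors, rowvar=False)

# Propagation: with an assumed sensitivity matrix J mapping burnout errors
# into two orbit parameters (e.g. apogee, perigee), the orbit-parameter
# covariance is J P J^T.
J = np.array([[2.0, 30.0, 0.0],
              [1.5, 25.0, 5.0]])
P_orbit = J @ P @ J.T
```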
FRAN: financial ratio analysis and more (Version 2.0 for Windows)
Bruce G. Hansen; Arnold J., Jr. Palmer
1999-01-01
FRAN is a computer-based, stand-alone program designed to generate important financial and operating ratios from tax and wage forms filed with the Internal Revenue Service. FRAN generates standard profitability, financial/leverage, liquidity/solvency, and activity ratios, as well as unique measures of workforce and capital cost and acquisition. Information produced by...
Using Testbanking To Implement Classroom Management/Extension through the Use of Computers.
ERIC Educational Resources Information Center
Thommen, John D.
Testbanking provides teachers with an effective, low-cost, time-saving opportunity to improve the testing aspect of their classes. Testbanking, which involves the use of a testbank program and a computer, allows teachers to develop and generate tests and test-forms with a minimum of effort. Teachers who test using true and false, multiple choice,…
ERIC Educational Resources Information Center
Dickes, Amanda Catherine; Sengupta, Pratim; Farris, Amy Voss; Satabdi, Basu
2016-01-01
In this paper, we present a third-grade ecology learning environment that integrates two forms of modeling--embodied modeling and agent-based modeling (ABMs)--through the generation of mathematical representations that are common to both forms of modeling. The term "agent" in the context of ABMs indicates individual computational objects…
Potential implementation of reservoir computing models based on magnetic skyrmions
NASA Astrophysics Data System (ADS)
Bourianoff, George; Pinna, Daniele; Sitte, Matthias; Everschor-Sitte, Karin
2018-05-01
Reservoir Computing is a type of recursive neural network commonly used for recognizing and predicting spatio-temporal events, relying on a complex hierarchy of nested feedback loops to generate a memory functionality. The Reservoir Computing paradigm does not require any knowledge of the reservoir topology or node weights for training purposes and can therefore utilize naturally existing networks formed by a wide variety of physical processes. Most prior efforts to implement reservoir computing have focused on utilizing memristor techniques to implement recursive neural networks. This paper examines the potential of magnetic skyrmion fabrics, and the complex current patterns which form in them, as an attractive physical instantiation for Reservoir Computing. We argue that their nonlinear dynamical interplay resulting from anisotropic magnetoresistance and spin-torque effects allows for an effective and energy-efficient nonlinear processing of spatio-temporal events with the aim of event recognition and prediction.
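For readers unfamiliar with the paradigm, a minimal conventional echo-state-network sketch (with assumed parameters and a standard ridge-regression readout, not the skyrmion implementation) shows why the reservoir itself is never trained.

```python
import numpy as np

# Minimal echo-state-network sketch: the reservoir is random and fixed,
# and only a linear readout is trained. All parameters are assumed.
rng = np.random.default_rng(1)
N = 200
W_in = rng.uniform(-0.5, 0.5, (N, 1))            # fixed input weights
W = rng.normal(0.0, 1.0, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1

def run(u):
    """Drive the reservoir with input sequence u; collect node states."""
    x = np.zeros(N)
    states = []
    for ut in u:
        x = np.tanh(W_in[:, 0] * ut + W @ x)     # nonlinear reservoir update
        states.append(x.copy())
    return np.array(states)

# Task: one-step-ahead prediction of a sine wave; train readout by ridge regression.
u = np.sin(0.2 * np.arange(400))
X, y = run(u)[:-1], u[1:]
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N), X.T @ y)
pred = X @ W_out
```

In a physical reservoir such as a skyrmion fabric, the tanh update is replaced by the material's own nonlinear dynamics; only the readout step survives unchanged.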
A MATLAB based 3D modeling and inversion code for MT data
NASA Astrophysics Data System (ADS)
Singh, Arun; Dehiya, Rahul; Gupta, Pravin K.; Israil, M.
2017-07-01
The development of a MATLAB based computer code, AP3DMT, for modeling and inversion of 3D Magnetotelluric (MT) data is presented. The code comprises two independent components: grid generator code and modeling/inversion code. The grid generator code performs model discretization and acts as an interface by generating various I/O files. The inversion code performs core computations in modular form - forward modeling, data functionals, sensitivity computations and regularization. These modules can be readily extended to other similar inverse problems like Controlled-Source EM (CSEM). The modular structure of the code provides a framework useful for implementation of new applications and inversion algorithms. The use of MATLAB and its libraries makes it more compact and user friendly. The code has been validated on several published models. To demonstrate its versatility and capabilities the results of inversion for two complex models are presented.
NASA Technical Reports Server (NTRS)
Milner, E. J.; Krosel, S. M.
1977-01-01
Techniques are presented for determining the elements of the A, B, C, and D state variable matrices for systems simulated on an EAI Pacer 100 hybrid computer. An automated procedure systematically generates disturbance data necessary to linearize the simulation model and stores these data on a floppy disk. A separate digital program verifies this data, calculates the elements of the system matrices, and prints these matrices appropriately labeled. The partial derivatives forming the elements of the state variable matrices are approximated by finite difference calculations.
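The hybrid-computer procedure itself is not reproduced here, but the finite-difference idea behind it can be sketched: perturb each state and input of a model f(x, u) and difference the responses to approximate the A and B matrices. The two-state model below is a made-up example, not the simulation from the report.

```python
import numpy as np

# Finite-difference linearization sketch: approximate A = df/dx and B = df/du
# at an operating point by central differences.
def f(x, u):
    # Made-up two-state, one-input nonlinear-model stand-in
    return np.array([x[1], -2.0 * x[0] - 0.5 * x[1] + u[0]])

def linearize(f, x0, u0, eps=1e-6):
    n, m = len(x0), len(u0)
    A = np.zeros((n, n))
    B = np.zeros((n, m))
    for j in range(n):
        dx = np.zeros(n)
        dx[j] = eps
        # Central difference in state j
        A[:, j] = (f(x0 + dx, u0) - f(x0 - dx, u0)) / (2.0 * eps)
    for j in range(m):
        du = np.zeros(m)
        du[j] = eps
        # Central difference in input j
        B[:, j] = (f(x0, u0 + du) - f(x0, u0 - du)) / (2.0 * eps)
    return A, B

A, B = linearize(f, np.zeros(2), np.zeros(1))
```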
Code of Federal Regulations, 2011 CFR
2011-04-01
... following: Statements; declarations; documents; electronically generated or machine readable data; electronically stored or transmitted information or data; books; papers; correspondence; accounts; financial accounting data; technical data; computer programs necessary to retrieve information in a usable form; and...
A subsequent closed-form description of propagated signaling phenomena in the membrane of an axon
DOE Office of Scientific and Technical Information (OSTI.GOV)
Melendy, Robert F., E-mail: rfmelendy@liberty.edu
2016-05-15
I recently introduced a closed-form description of propagated signaling phenomena in the membrane of an axon [R.F. Melendy, Journal of Applied Physics 118, 244701 (2015)]. Those results demonstrate how intracellular conductance, the thermodynamics of magnetization, and current modulation function together in generating an action potential in a unified, closed-form description. At present, I report on a subsequent closed-form model that unifies intracellular conductance and the thermodynamics of magnetization with the membrane electric field, E_m. It is anticipated this work will compel researchers in biophysics, physical biology, and the computational neurosciences to probe deeper into the classical and quantum features of membrane magnetization and signaling, informed by the computational features of this subsequent model.
Schwarz-Christoffel Conformal Mapping based Grid Generation for Global Oceanic Circulation Models
NASA Astrophysics Data System (ADS)
Xu, Shiming
2015-04-01
We propose new grid generation algorithms for global ocean general circulation models (OGCMs). Contrary to conventional dipolar or tripolar grids based on analytical forms, the new algorithms are based on Schwarz-Christoffel (SC) conformal mapping with prescribed boundary information. While dealing with the conventional grid design problem of pole relocation, they also address more advanced issues of computational efficiency and the new requirements on OGCM grids arising from the recent trend of high-resolution and multi-scale modeling. The proposed grid generation algorithms could potentially achieve the alignment of grid lines to coastlines, enhanced spatial resolution in coastal regions, and easier computational load balance. Since the generated grids are still orthogonal curvilinear, they can be readily utilized in existing Bryan-Cox-Semtner type ocean models. The proposed methodology can also be applied to the grid generation task for regional ocean modeling when a complex land-ocean distribution is present.
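The SC mapping machinery itself is beyond a short sketch, but the key property the abstract relies on (a conformal map keeps an orthogonal grid orthogonal) is easy to check numerically with a simple analytic map; w = z**2 below stands in for an actual Schwarz-Christoffel map.

```python
import numpy as np

# Conformality check: push a Cartesian grid through the analytic map
# w = z**2 (a stand-in for an actual Schwarz-Christoffel map) and verify
# that the images of the two families of grid lines still cross at right
# angles, which is why such grids drop into orthogonal-curvilinear OGCMs.
u = np.linspace(0.5, 1.5, 21)
v = np.linspace(0.5, 1.5, 21)
U, V = np.meshgrid(u, v)
W = (U + 1j * V) ** 2                # conformal away from z = 0

# Centered-difference tangents along the two grid directions, at interior nodes
t_u = W[1:-1, 2:] - W[1:-1, :-2]
t_v = W[2:, 1:-1] - W[:-2, 1:-1]

# Orthogonality: the planar dot product of the tangents is Re(t_u * conj(t_v))
dot = np.real(t_u * np.conj(t_v))
max_dev = float(np.max(np.abs(dot) / (np.abs(t_u) * np.abs(t_v))))
```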
NASA Scientists Push the Limits of Computer Technology
NASA Technical Reports Server (NTRS)
1998-01-01
Dr. Donald Frazier, a NASA researcher, uses a blue laser shining through a quartz window into a special mix of chemicals to generate a polymer film on the inside quartz surface. As the chemicals respond to the laser light, they adhere to the glass surface, forming optical films. Dr. Frazier and Dr. Mark S. Paley developed the process in the Space Sciences Laboratory at NASA's Marshall Space Flight Center in Huntsville, AL. Working aboard the Space Shuttle, a science team led by Dr. Frazier formed thin films potentially useful in optical computers with fewer impurities than those formed on Earth. Patterns of these films can be traced onto the quartz surface. In the optical computers of the future, these films could replace electronic circuits and wires, making the systems more efficient and cost-effective, as well as lighter and more compact. Photo credit: NASA/Marshall Space Flight Center.
NASA Scientists Push the Limits of Computer Technology
NASA Technical Reports Server (NTRS)
1998-01-01
NASA researcher Dr. Donald Frazier uses a blue laser shining through a quartz window into a special mix of chemicals to generate a polymer film on the inside quartz surface. As the chemicals respond to the laser light, they adhere to the glass surface, forming optical films. Dr. Frazier and Dr. Mark S. Paley developed the process in the Space Sciences Laboratory at NASA's Marshall Space Flight Center in Huntsville, AL. Working aboard the Space Shuttle, a science team led by Dr. Frazier formed thin films potentially useful in optical computers with fewer impurities than those formed on Earth. Patterns of these films can be traced onto the quartz surface. In the optical computers of the future, these films could replace electronic circuits and wires, making the systems more efficient and cost-effective, as well as lighter and more compact. Photo credit: NASA/Marshall Space Flight Center.
NASA Scientists Push the Limits of Computer Technology
NASA Technical Reports Server (NTRS)
1999-01-01
NASA researcher Dr. Donald Frazier uses a blue laser shining through a quartz window into a special mix of chemicals to generate a polymer film on the inside quartz surface. As the chemicals respond to the laser light, they adhere to the glass surface, forming optical films. Dr. Frazier and Dr. Mark S. Paley developed the process in the Space Sciences Laboratory at NASA's Marshall Space Flight Center in Huntsville, AL. Working aboard the Space Shuttle, a science team led by Dr. Frazier formed thin films potentially useful in optical computers with fewer impurities than those formed on Earth. Patterns of these films can be traced onto the quartz surface. In the optical computers of the future, these films could replace electronic circuits and wires, making the systems more efficient and cost-effective, as well as lighter and more compact. Photo credit: NASA/Marshall Space Flight Center.
User manual for two simple postscript output FORTRAN plotting routines
NASA Technical Reports Server (NTRS)
Nguyen, T. X.
1991-01-01
Graphics is one of the important tools in engineering analysis and design. However, plotting routines that generate output on high quality laser printers normally come in graphics packages, which tend to be expensive and system dependent. These factors become important for small computer systems or desktop computers, especially when only some form of a simple plotting routine is sufficient. With the PostScript language becoming popular, more and more PostScript laser printers are now available. Simple, versatile, low cost plotting routines that can generate output on high quality laser printers are needed, and standard FORTRAN plotting routines that produce output in the PostScript language are a logical choice. The purpose here is to explain two simple FORTRAN plotting routines that generate output in the PostScript language.
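The essential idea of such routines - translating data coordinates into PostScript path operators that any PostScript printer can render - can be sketched in a few lines. This is an illustrative Python stand-in, not the FORTRAN routines the manual documents; the function name and page size are assumptions:

```python
# Illustrative stand-in for a minimal plotting routine: scale the data to
# the page and emit PostScript path operators (moveto/lineto/stroke) that
# any PostScript printer understands.
def polyline_to_eps(points, width=288, height=216):
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    x0, y0 = min(xs), min(ys)
    sx = width / ((max(xs) - x0) or 1.0)
    sy = height / ((max(ys) - y0) or 1.0)
    lines = ["%!PS-Adobe-3.0 EPSF-3.0",
             "%%BoundingBox: 0 0 {} {}".format(width, height),
             "newpath"]
    for i, (x, y) in enumerate(points):
        op = "moveto" if i == 0 else "lineto"
        lines.append("{:.2f} {:.2f} {}".format((x - x0) * sx, (y - y0) * sy, op))
    lines += ["stroke", "showpage"]
    return "\n".join(lines)

eps = polyline_to_eps([(0.0, 0.0), (1.0, 1.0), (2.0, 0.5)])
```

Writing the returned string to a `.eps` file produces output any PostScript device can print, which is the device independence the abstract emphasizes.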
Preliminary weight and costs of sandwich panels to distribute concentrated loads
NASA Technical Reports Server (NTRS)
Belleman, G.; Mccarty, J. E.
1976-01-01
Minimum mass honeycomb sandwich panels were sized for transmitting a concentrated load to a uniform reaction through various distances. The face skin gages were fully stressed with a finite element computer code. The panel general stability was evaluated with a buckling computer code labeled STAGS-B. Two skin materials were considered: aluminum and graphite-epoxy. The core was constant-thickness aluminum honeycomb. Various panel sizes and load levels were considered. The computer generated data were generalized to allow preliminary least mass panel designs for a wide range of panel sizes and load intensities. An assessment of panel fabrication cost was also conducted. Various comparisons between panel mass, panel size, panel loading, and panel cost are presented in both tabular and graphical form.
A computational study on the interaction between a vortex and a shock wave
NASA Technical Reports Server (NTRS)
Meadows, Kristine R.; Kumar, Ajay; Hussaini, M. Y.
1989-01-01
A computational study of two-dimensional shock vortex interaction is discussed in this paper. A second order upwind finite volume method is used to solve the Euler equations in conservation form. In this method, the shock wave is captured rather than fitted so that the cases where shock vortex interaction may cause secondary shocks can also be investigated. The effects of vortex strength on the computed flow and acoustic field generated by the interaction are qualitatively evaluated.
Solid–Liquid Phase Change Driven by Internal Heat Generation
DOE Office of Scientific and Technical Information (OSTI.GOV)
John Crepeau; Ali S. Siahpush
2012-07-01
This article presents results of solid-liquid phase change, the Stefan Problem, where melting is driven by internal heat generation, in a cylindrical geometry. The comparison between a quasi-static analytical solution for Stefan numbers less than one and numerical solutions shows good agreement. The computational results of phase change with internal heat generation show how convection cells form in the liquid region. A scale analysis of the same problem shows four distinct regions of the melting process.
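For orientation, the role of the Stefan number can be illustrated with the classic one-phase Stefan problem (melting driven by a constant wall temperature rather than internal heat generation, so a simpler stand-in for the article's internally heated cylindrical case). The melt front moves as s(t) = 2*lam*sqrt(alpha*t), with lam solving a transcendental equation:

```python
import math

# Classic one-phase Stefan problem (constant wall temperature): the melt
# front is s(t) = 2*lam*sqrt(alpha*t), where lam solves
#     lam * exp(lam**2) * erf(lam) = Ste / sqrt(pi).
# For small Stefan number (the quasi-static regime), lam ~ sqrt(Ste/2).
def stefan_lambda(ste, lo=1e-9, hi=5.0, tol=1e-12):
    f = lambda lam: (lam * math.exp(lam * lam) * math.erf(lam)
                     - ste / math.sqrt(math.pi))
    while hi - lo > tol:            # bisection; f is increasing in lam
        mid = 0.5 * (lo + hi)
        if f(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

lam = stefan_lambda(0.1)            # Ste < 1, as in the quasi-static comparison
```

At Ste = 0.1 the exact root is close to the quasi-static estimate sqrt(Ste/2), which is why quasi-static solutions agree well with numerics for Stefan numbers below one.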
Koopman Operator Framework for Time Series Modeling and Analysis
NASA Astrophysics Data System (ADS)
Surana, Amit
2018-01-01
We propose an interdisciplinary framework for time series classification, forecasting, and anomaly detection by combining concepts from Koopman operator theory, machine learning, and linear systems and control theory. At the core of this framework is nonlinear dynamic generative modeling of time series using the Koopman operator, which is an infinite-dimensional but linear operator. Rather than working with the underlying nonlinear model, we propose two simpler linear representations or model forms based on Koopman spectral properties. We show that these model forms are invariants of the generative model and can be readily identified directly from data using techniques for computing Koopman spectral properties, without requiring explicit knowledge of the generative model. We also introduce different notions of distances on the space of such model forms, which is essential for model comparison/clustering. We employ the space of Koopman model forms equipped with distance, in conjunction with classical machine learning techniques, to develop a framework for automatic feature generation for time series classification. The forecasting/anomaly detection framework is based on using Koopman model forms along with classical linear systems and control approaches. We demonstrate the proposed framework for human activity classification, and for time series forecasting/anomaly detection in a power grid application.
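The data-driven identification step described above can be sketched with the simplest technique for approximating Koopman spectral properties, Dynamic Mode Decomposition: fit a linear map A with x_{k+1} ≈ A x_k from snapshot pairs and read off its eigenvalues. This is an illustrative sketch, not the authors' implementation:

```python
import numpy as np

# Dynamic Mode Decomposition as the simplest data-driven approximation of
# Koopman spectral properties: fit x_{k+1} ~= A x_k in the least-squares
# sense and take the eigenvalues of A. Sizes and data are illustrative.
def dmd_eigenvalues(snapshots):
    X, Y = snapshots[:, :-1], snapshots[:, 1:]
    A = Y @ np.linalg.pinv(X)       # least-squares linear generative model
    return np.linalg.eigvals(A)

# Toy data from a known linear system: its eigenvalues should be recovered
# directly from the snapshots, without using A_true itself.
A_true = np.array([[0.9, 0.2], [0.0, 0.5]])
x = np.array([1.0, 1.0])
traj = [x]
for _ in range(20):
    x = A_true @ x
    traj.append(x)
snapshots = np.array(traj).T        # one column per snapshot
eigs = np.sort(dmd_eigenvalues(snapshots).real)
```

The recovered eigenvalues are exactly the invariants one would use for comparing or clustering model forms, since they do not depend on the particular trajectory used.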
NASA Astrophysics Data System (ADS)
de Oliveira, José Martins, Jr.; Mangini, F. Salvador; Carvalho Vila, Marta Maria Duarte; Chaud, Marco Vinícius
2013-05-01
This work presents an alternative and non-conventional technique for evaluating physicochemical properties of pharmaceutical dosage forms: we used the computed tomography (CT) technique as a nondestructive way to visualize the internal structures of pharmaceutical dosage forms and to conduct static and dynamic studies. The studies were conducted using tomographic images generated by the scanner at the University of Sorocaba (Uniso). We have shown that through the use of tomographic images it is possible to conduct studies of porosity and density, analysis of morphological parameters, and studies of dissolution. Our results are in agreement with the literature, showing that CT is a powerful tool for use in the pharmaceutical sciences.
NASA Technical Reports Server (NTRS)
Spuler, Linda M.; Ford, Patricia K.; Skeete, Darren C.; Hershman, Scot; Raviprakash, Pushpa; Arnold, John W.; Tran, Victor; Haenze, Mary Alice
2005-01-01
"Close Call Action Log Form" ("CCALF") is the name of both a computer program and a Web-based service provided by the program for creating an enhanced database of close calls (in the colloquial sense of mishaps that were avoided by small margins) assigned to the Center Operations Directorate (COD) at Johnson Space Center. CCALF provides a single facility for on-line collaborative review of close calls. Through CCALF, managers can delegate responses to employees. CCALF utilizes a pre-existing e-mail system to notify managers that there are close calls to review, but eliminates the need for the prior practices of passing multiple e-mail messages around the COD, then collecting and consolidating them into final responses: CCALF now collects comments from all responders for incorporation into reports that it generates. Also, whereas it was previously necessary to manually calculate metrics (e.g., numbers of maintenance-work orders necessitated by close calls) for inclusion in the reports, CCALF now computes the metrics, summarizes them, and displays them in graphical form. The reports and all pertinent information used to generate the reports are logged, tracked, and retained by CCALF for historical purposes.
2013-01-01
The anhydrate and the stoichiometric tetarto-hydrate of pyrogallol (0.25 mol water per mol pyrogallol) are both storage stable at ambient conditions, provided that they are phase pure, with the system being at equilibrium at a_w (water activity) = 0.15 at 25 °C. Structures have been derived from single crystal and powder X-ray diffraction data for the anhydrate and hydrate, respectively. It is notable that the tetarto-hydrate forms a tetragonal structure with water in channels, a framework that, although stabilized by water, is found as a higher energy structure on a computationally generated crystal energy landscape, which has the anhydrate crystal structure as the most stable form. Thus, a combination of slurry experiments, X-ray diffraction, spectroscopy, moisture (de)sorption, and thermo-analytical methods with the computationally generated crystal energy landscape and lattice energy calculations provides a consistent picture of the finely balanced hydration behavior of pyrogallol. In addition, two monotropically related dimethyl sulfoxide monosolvates were found in the accompanying solid form screen. PMID:24027438
Braun, Doris E; Bhardwaj, Rajni M; Arlin, Jean-Baptiste; Florence, Alastair J; Kahlenberg, Volker; Griesser, Ulrich J; Tocher, Derek A; Price, Sarah L
2013-09-04
The anhydrate and the stoichiometric tetarto-hydrate of pyrogallol (0.25 mol water per mol pyrogallol) are both storage stable at ambient conditions, provided that they are phase pure, with the system being at equilibrium at a_w (water activity) = 0.15 at 25 °C. Structures have been derived from single crystal and powder X-ray diffraction data for the anhydrate and hydrate, respectively. It is notable that the tetarto-hydrate forms a tetragonal structure with water in channels, a framework that, although stabilized by water, is found as a higher energy structure on a computationally generated crystal energy landscape, which has the anhydrate crystal structure as the most stable form. Thus, a combination of slurry experiments, X-ray diffraction, spectroscopy, moisture (de)sorption, and thermo-analytical methods with the computationally generated crystal energy landscape and lattice energy calculations provides a consistent picture of the finely balanced hydration behavior of pyrogallol. In addition, two monotropically related dimethyl sulfoxide monosolvates were found in the accompanying solid form screen.
This Is Your Future: A Case Study Approach to Foster Health Literacy
ERIC Educational Resources Information Center
Brey, Rebecca A.; Clark, Susan E.; Wantz, Molly S.
2008-01-01
Today's young people seem to live in an even faster-paced society than previous generations. As in the past, they are involved in sports, music, school, church, and work, and are exposed to many forms of mass media that add to their base of information. However, they also have instant access to computer-generated information such as the Internet,…
A Study on Gröbner Basis with Inexact Input
NASA Astrophysics Data System (ADS)
Nagasaka, Kosaku
Gröbner bases are one of the most important tools in recent symbolic algebraic computation. However, computing a Gröbner basis for a given polynomial ideal is not easy, and the computation is not numerically stable if the polynomials have inexact coefficients. In this paper, we study what we should expect when computing a Gröbner basis with inexact coefficients, and introduce a naive method to compute a Gröbner basis by reduced row echelon form for the ideal generated by a given polynomial set having a priori errors on its coefficients.
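For the degenerate case of an ideal generated by linear polynomials, the reduced row echelon form of the coefficient matrix directly yields the reduced Gröbner basis, which makes the matrix-based idea concrete. A minimal exact-arithmetic sketch (not the paper's method for inexact inputs):

```python
from fractions import Fraction

# For an ideal generated by linear polynomials, the reduced row echelon
# form of the coefficient matrix is the reduced Groebner basis. Columns
# below are the coefficients of x, y, and the constant term.
def rref(rows):
    rows = [[Fraction(v) for v in r] for r in rows]
    lead = 0
    for r in range(len(rows)):
        if lead >= len(rows[0]):
            break
        i = r
        while rows[i][lead] == 0:   # find a pivot row for this column
            i += 1
            if i == len(rows):
                i, lead = r, lead + 1
                if lead == len(rows[0]):
                    return rows
        rows[i], rows[r] = rows[r], rows[i]
        piv = rows[r][lead]
        rows[r] = [v / piv for v in rows[r]]
        for j in range(len(rows)):  # eliminate the pivot column elsewhere
            if j != r:
                f = rows[j][lead]
                rows[j] = [a - f * b for a, b in zip(rows[j], rows[r])]
        lead += 1
    return rows

# Ideal <x + y - 3, x - y - 1>: RREF gives the Groebner basis {x - 2, y - 1}.
G = rref([[1, 1, -3], [1, -1, -1]])
```

Exact rational arithmetic sidesteps the numerical instability the abstract warns about; the paper's concern is precisely what happens when the input coefficients themselves carry errors.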
Reconfigurable optical interconnections via dynamic computer-generated holograms
NASA Technical Reports Server (NTRS)
Liu, Hua-Kuang (Inventor); Zhou, Shaomin (Inventor)
1994-01-01
A system is proposed for optically providing one-to-many irregular interconnections, and strength-adjustable many-to-many irregular interconnections with strengths (weights) w_ij, using multiple laser beams that address multiple holograms and means for combining the beams modified by the holograms to form multiple interconnections, such as a cross-bar switching network. The optical means for interconnection is based on entering a series of complex computer-generated holograms on an electrically addressed spatial light modulator for real-time reconfiguration, thus providing flexibility for interconnection networks for large-scale practical use. By employing multiple sources and holograms, the number of interconnection patterns achieved is increased greatly.
ERIC Educational Resources Information Center
Ramsay, Judith; Terras, Melody M.
2015-01-01
The use of technology to support learning is well recognised. One generation ago a major strand of human-computer interaction research focused on the development of forms of instruction in how to interact with computers. Today, however, the advanced usability of modern technologies has all but removed the presence of many user manuals. Learners,…
Algebraic grid generation with corner singularities
NASA Technical Reports Server (NTRS)
Vinokur, M.; Lombard, C. K.
1983-01-01
A simple noniterative algebraic procedure is presented for generating smooth computational meshes on a quadrilateral topology. Coordinate distribution and normal derivative are provided on all boundaries, one of which may include a slope discontinuity. The boundary conditions are sufficient to guarantee continuity of global meshes formed of joined patches generated by the procedure. The method extends to 3-D. The procedure involves a synthesis of prior techniques - stretching functions, cubic blending functions, and transfinite interpolation - to which is added the functional form of the corner solution. The procedure introduces the concept of generalized blending, which is implemented as an automatic scaling of the boundary derivatives for effective interpolation. Some implications of the treatment at boundaries for techniques solving elliptic PDEs are discussed in an Appendix.
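The transfinite-interpolation ingredient mentioned above can be sketched as a plain Coons patch: blend the four boundary curves linearly and subtract the bilinear corner term. The paper's procedure adds stretching functions, cubic blending, and the corner-singularity term on top of this; the version below is only the basic building block:

```python
import numpy as np

# Basic Coons patch: blend the four boundary curves linearly and subtract
# the bilinear corner term so the patch interpolates all four boundaries.
def coons(bottom, top, left, right):
    B, T = np.asarray(bottom, float), np.asarray(top, float)   # (m, 2)
    L, R = np.asarray(left, float), np.asarray(right, float)   # (n, 2)
    m, n = len(B), len(L)
    s = np.linspace(0.0, 1.0, m)[None, :, None]
    t = np.linspace(0.0, 1.0, n)[:, None, None]
    ruled = ((1 - t) * B[None] + t * T[None]
             + (1 - s) * L[:, None] + s * R[:, None])
    corners = ((1 - s) * (1 - t) * B[0] + s * (1 - t) * B[-1]
               + (1 - s) * t * T[0] + s * t * T[-1])
    return ruled - corners

# Unit square with straight boundaries: the patch is the uniform grid.
xs = np.linspace(0.0, 1.0, 5)
G = coons([(x, 0.0) for x in xs], [(x, 1.0) for x in xs],
          [(0.0, y) for y in xs], [(1.0, y) for y in xs])
```

Because the method is algebraic and noniterative, a patch like this is generated in a single pass, which is the efficiency advantage over elliptic PDE grid generators.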
The influence of arc plasma parameters on the form of a welding pool
NASA Astrophysics Data System (ADS)
Frolov, V. Ya.; Toropchin, A. I.
2015-07-01
The influence of the Marangoni force on the form of a welding pool has been considered. Results of computer simulation of the processes of welding arc generation with a non-consumable tungsten electrode in inert gas are shown. The experimental results are reported and comparatively analyzed. The calculations were carried out in a package of applied programs at various currents.
Production of confluent hypergeometric beam by computer-generated hologram
NASA Astrophysics Data System (ADS)
Chen, Jiannong; Wang, Gang; Xu, Qinfeng
2011-02-01
Because of their spiral wave front, phase singularity, zero-intensity center, and orbital angular momentum, dark hollow vortex beams have found many applications in the field of atom optics, such as atom cooling, atom transport, and atom guiding. In this paper, a method for generating a confluent hypergeometric beam by a computer-generated hologram displayed on a spatial light modulator is presented. The hologram is formed by interference between a single-ring Laguerre-Gaussian beam and a plane wave. The far-field Fraunhofer diffraction of the optical field transmitted from the hologram is the confluent hypergeometric beam. This beam is circularly symmetric and has a phase singularity, spiral wave front, zero-intensity center, and intrinsic orbital angular momentum. It is a new dark hollow vortex beam.
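The hologram computation described above can be sketched directly: superpose a single-ring Laguerre-Gaussian beam of topological charge l with a tilted plane wave and record the intensity, which produces the characteristic fork grating. Beam parameters below are illustrative assumptions, not those of the experiment:

```python
import numpy as np

# Interference of a single-ring Laguerre-Gaussian beam (charge l) with a
# tilted plane wave; the recorded intensity is the fork-grating hologram
# sent to the spatial light modulator. Waist, tilt, and resolution are
# illustrative assumptions.
def fork_hologram(n=256, l=3, w0=0.3, tilt=20.0):
    x = np.linspace(-1.0, 1.0, n)
    X, Y = np.meshgrid(x, x)
    r, phi = np.hypot(X, Y), np.arctan2(Y, X)
    lg = (r / w0) ** abs(l) * np.exp(-(r / w0) ** 2) * np.exp(1j * l * phi)
    plane = np.exp(1j * tilt * X)          # tilted reference wave
    return np.abs(lg + plane) ** 2         # recorded interference intensity

H = fork_hologram()
```

The azimuthal phase term exp(i*l*phi) is what splits the grating fringes into the l-pronged fork; illuminating the displayed pattern and taking the far-field (Fraunhofer) diffraction yields the vortex beam.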
Equilibrium Temperature Profiles within Fission Product Waste Forms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kaminski, Michael D.
2016-10-01
We studied waste form strategies for advanced fuel cycle schemes. Several options were considered for three waste streams with the following fission products: cesium and strontium, transition metals, and lanthanides. These three waste streams may be combined or disposed separately. The decay of several isotopes will generate heat that must be accommodated by the waste form, and this heat will affect the waste loadings. To help make an informed decision on the best option, we present computational data on the equilibrium temperature of glass waste forms containing a combination of these three streams.
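For a rough sense of the heat-accommodation constraint described above, the equilibrium profile in a long cylindrical waste form with uniform volumetric decay heat and a fixed surface temperature follows from steady conduction: T(r) = Ts + q(R^2 - r^2)/(4k). The numbers below are placeholder values, not data from the report:

```python
# Steady conduction in a long cylinder with uniform volumetric heating q
# (W/m^3), thermal conductivity k, radius R, and surface temperature Ts:
#     T(r) = Ts + q * (R**2 - r**2) / (4 * k)
# All numbers are placeholder values, not data from the report.
def cylinder_profile(r, q=5.0e3, R=0.3, k=1.1, Ts=300.0):
    return Ts + q * (R**2 - r**2) / (4.0 * k)

T_center = cylinder_profile(0.0)    # hottest point, on the axis
T_surface = cylinder_profile(0.3)   # pinned to Ts at r = R
```

The centerline temperature rise scales with q*R^2/k, which is why waste loading (through q) and waste-form geometry must be traded off against the glass's allowable temperature.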
On the computer analysis of structures and mechanical systems
NASA Technical Reports Server (NTRS)
Bennett, B. E.
1984-01-01
The governing equations for the analysis of open branch-chain mechanical systems are developed in a form suitable for implementation in a general purpose finite element computer program. Lagrange's form of d'Alembert's principle is used to derive the system mass matrix and force vector. The generalized coordinates are selected as the unconstrained relative degrees of freedom giving the position and orientation of each slave link with respect to their master link. Each slave link may have from zero to six degrees of freedom relative to the reference frames of its master link. A strategy for automatic generation of the system mass matrix and force vector is described.
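The automatic generation of the system mass matrix can be sketched numerically: for an open chain of point masses with relative joint coordinates q, M(q) = sum_i m_i * J_i(q)^T J_i(q), with J_i the Jacobian of mass i's position. The two-link example below is illustrative, not the paper's finite element implementation:

```python
import numpy as np

# Numerical sketch: for an open chain of point masses with relative joint
# angles q, the system mass matrix is M(q) = sum_i m_i * J_i^T J_i, with
# J_i the Jacobian of mass i's position. Lengths and masses are illustrative.
def positions(q, lengths=(1.0, 0.8)):
    pts, angle, x, y = [], 0.0, 0.0, 0.0
    for qi, li in zip(q, lengths):
        angle += qi                  # relative -> absolute link angle
        x += li * np.cos(angle)
        y += li * np.sin(angle)
        pts.append((x, y))
    return np.array(pts)

def mass_matrix(q, masses=(2.0, 1.0), h=1e-6):
    n = len(q)
    J = np.zeros((len(masses), 2, n))
    for k in range(n):               # Jacobians by central differences
        dq = np.zeros(n)
        dq[k] = h
        J[:, :, k] = (positions(q + dq) - positions(q - dq)) / (2.0 * h)
    return sum(m * J[i].T @ J[i] for i, m in enumerate(masses))

M = mass_matrix(np.array([0.3, 0.5]))
```

Each slave link's degrees of freedom are measured relative to its master link, so the configuration-dependent mass matrix is assembled automatically from the chain description, mirroring the strategy the abstract outlines.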
Defining Computational Thinking for Mathematics and Science Classrooms
NASA Astrophysics Data System (ADS)
Weintrop, David; Beheshti, Elham; Horn, Michael; Orton, Kai; Jona, Kemi; Trouille, Laura; Wilensky, Uri
2016-02-01
Science and mathematics are becoming computational endeavors. This fact is reflected in the recently released Next Generation Science Standards and the decision to include "computational thinking" as a core scientific practice. With this addition, and the increased presence of computation in mathematics and scientific contexts, a new urgency has come to the challenge of defining computational thinking and providing a theoretical grounding for what form it should take in school science and mathematics classrooms. This paper presents a response to this challenge by proposing a definition of computational thinking for mathematics and science in the form of a taxonomy consisting of four main categories: data practices, modeling and simulation practices, computational problem solving practices, and systems thinking practices. In formulating this taxonomy, we draw on the existing computational thinking literature, interviews with mathematicians and scientists, and exemplary computational thinking instructional materials. This work was undertaken as part of a larger effort to infuse computational thinking into high school science and mathematics curricular materials. In this paper, we argue for the approach of embedding computational thinking in mathematics and science contexts, present the taxonomy, and discuss how we envision the taxonomy being used to bring current educational efforts in line with the increasingly computational nature of modern science and mathematics.
Simulation of isoelectric focusing processes. [stationary electrolysis of charged species
NASA Technical Reports Server (NTRS)
Palusinski, O. A.
1980-01-01
This paper presents the computer implementation of a model for the stationary electrolysis of two or more charged species. This has specific application to the technique of isoelectric focusing, in which the stationary electrolysis of ampholytes is used to generate a pH gradient useful for the separation of proteins, peptides, and other biomolecules. The fundamental equations describing the process are given. These equations are transformed to a form suitable for digital computer implementation. Some results of computer simulation are described and compared to data obtained in the laboratory.
HIS-Based Support of Follow-Up Documentation – Concept and Implementation for Clinical Studies
Herzberg, S.; Fritz, F.; Rahbar, K.; Stegger, L.; Schäfers, M.; Dugas, M.
2011-01-01
Objective Follow-up data must be collected according to the protocol of each clinical study, i.e., at certain time points. Missing follow-up information is a critical problem and may impede or bias the analysis of study data and result in delays. Moreover, additional patient recruitment may be necessary due to incomplete follow-up data. Current electronic data capture (EDC) systems in clinical studies are usually separated from hospital information systems (HIS) and therefore can provide only limited functionality to support clinical workflow. In two case studies, we assessed the feasibility of HIS-based support of follow-up documentation. Methods We have developed a data model and a HIS-based workflow to provide follow-up forms according to clinical study protocols. If a follow-up form was due, a database procedure created a follow-up event which was translated by a communication server into an HL7 message and transferred to the import interface of the clinical information system (CIS). This procedure generated the required follow-up form and enqueued a link to it in the work lists of the relevant study nurses and study physicians. Results A HIS-based follow-up system automatically generated follow-up forms as defined by a clinical study protocol. These forms were scheduled into work lists of study nurses and study physicians. This system was integrated into the clinical workflow of two clinical studies. In a study from nuclear medicine, each scenario from the test concept according to the protocol of the single photon emission computed tomography/computed tomography (SPECT/CT) study was simulated and each scenario passed the test. For a study in psychiatry, 128 follow-up forms were automatically generated within 27 weeks, on average five forms per week (maximum 12, minimum 1 form per week). Conclusion HIS-based support of follow-up documentation in clinical studies is technically feasible and can support compliance with study protocols. PMID:23616857
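The scheduling logic at the heart of the Methods section can be sketched in miniature: given a protocol's follow-up offsets and a patient's enrollment date, compute which forms are due and should be enqueued. Names and time points below are hypothetical, not the actual HIS/HL7 implementation:

```python
from datetime import date, timedelta

# Hypothetical protocol: follow-up forms are due at fixed day offsets
# from enrollment. Form names and offsets are illustrative only.
PROTOCOL = {"baseline": 0, "3-month": 90, "6-month": 180}

def due_forms(enrolled, today):
    # Forms whose due date has been reached belong on the work list.
    return [name for name, offset in PROTOCOL.items()
            if enrolled + timedelta(days=offset) <= today]

forms = due_forms(date(2011, 1, 1), date(2011, 5, 1))
```

In the system described above, each such due form would additionally trigger an HL7 message to the CIS import interface so the form appears in the responsible study nurse's work list.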
Bolintineanu, Dan S.; Rao, Rekha R.; Lechman, Jeremy B.; ...
2017-11-05
Here, we generate a wide range of models of proppant-packed fractures using discrete element simulations, and measure fracture conductivity using finite element flow simulations. This allows for a controlled computational study of proppant structure and its relationship to fracture conductivity and stress in the proppant pack. For homogeneous multi-layered packings, we observe the expected increase in fracture conductivity with increasing fracture aperture, while the stress on the proppant pack remains nearly constant. This is consistent with the expected behavior in conventional proppant-packed fractures, but the present work offers a novel quantitative analysis with an explicit geometric representation of the proppant particles. In single-layered packings (i.e. proppant monolayers), there is a drastic increase in fracture conductivity as the proppant volume fraction decreases and open flow channels form. However, this also corresponds to a sharp increase in the mechanical stress on the proppant pack, as measured by the maximum normal stress relative to the side crushing strength of typical proppant particles. We also generate a variety of computational geometries that resemble highly heterogeneous proppant packings hypothesized to form during channel fracturing. In some cases, these heterogeneous packings show drastic improvements in conductivity with only moderate increase in the stress on the proppant particles, suggesting that in certain applications these structures are indeed optimal. We also compare our computer-generated structures to micro computed tomography imaging of a manually fractured laboratory-scale shale specimen, and find reasonable agreement in the geometric characteristics.
NASA Technical Reports Server (NTRS)
Bateman, Don
1991-01-01
Wind shear detection status is presented in the form of view-graphs. The following subject areas are covered: second generation detection (Q-bias, gamma bias, temperature biases, maneuvering flight modulation, and altitude modulation); third generation wind shear detection (use wind shear computation to augment flight path and terrain alerts, modulation of alert thresholds based on wind/terrain data base, incorporate wind shear/terrain alert enhancements from predictive sensor data); and future research and development.
Real-time range generation for ladar hardware-in-the-loop testing
NASA Astrophysics Data System (ADS)
Olson, Eric M.; Coker, Charles F.
1996-05-01
Real-time closed-loop simulation of LADAR seekers in a hardware-in-the-loop facility can reduce program risk and cost. This paper discusses an implementation of real-time range imagery generated in a synthetic environment at the Kinetic Kill Vehicle Hardware-in-the-Loop facility at Eglin AFB, for the stimulation of LADAR seekers and algorithms. The computer hardware platform used was a Silicon Graphics Incorporated Onyx Reality Engine. This computer contains graphics hardware and is optimized for generating visible or infrared imagery in real time. A by-product of the rendering process, in the form of a depth buffer, is generated from all objects in view. The depth buffer is an array of integer values that contributes to the proper rendering of overlapping objects and can be converted to range values using a mathematical formula. This paper presents an optimized software approach to generating the scenes, calculating the range values, and outputting the range data for a LADAR seeker.
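The depth-to-range conversion can be made explicit. For a standard OpenGL-style perspective projection with near and far clip planes n and f and a stored depth d in [0, 1], eye-space range is 2nf / (f + n - (2d - 1)(f - n)); this is the generic graphics-pipeline identity, offered as an assumption about the unspecified formula in the text, not the exact SGI-specific code:

```python
# Converting a stored depth-buffer value d in [0, 1] back to eye-space
# range, for an OpenGL-style perspective projection with near/far clip
# planes n and f. Generic pipeline identity, not the SGI-specific code.
def depth_to_range(d, n=1.0, f=1000.0):
    z_ndc = 2.0 * d - 1.0                       # depth to clip space
    return 2.0 * n * f / (f + n - z_ndc * (f - n))

near_hit = depth_to_range(0.0)                  # objects at the near plane
far_hit = depth_to_range(1.0)                   # objects at the far plane
```

The nonlinearity of this mapping (most depth precision is packed near the eye) is why the raw integer depth buffer must be converted before the values can stimulate a LADAR seeker as physically meaningful ranges.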
Assessment of gene order computing methods for Alzheimer's disease
2013-01-01
Background Computational genomics of Alzheimer disease (AD), the most common form of senile dementia, is a nascent field in AD research. The field includes AD gene clustering by computing gene order, which generates higher quality gene clustering patterns than most other clustering methods. However, there are few available gene order computing methods, such as the Genetic Algorithm (GA) and Ant Colony Optimization (ACO). Further, their performance in gene order computation using AD microarray data is not known. We thus set forth to evaluate the performance of current gene order computing methods with different distance formulas, and to identify additional features associated with gene order computation. Methods Using different distance formulas (Pearson distance, Euclidean distance, and the squared Euclidean distance) and other conditions, gene orders were calculated by ACO and GA (including standard GA and improved GA) methods, respectively. The qualities of the gene orders were compared, and new features from the calculated gene orders were identified. Results Compared to the GA methods tested in this study, ACO fits the AD microarray data the best when calculating gene order. In addition, the following features were revealed: different distance formulas generated a different quality of gene order, and the commonly used Pearson distance was not the best distance formula when used with both GA and ACO methods for AD microarray data. Conclusion Compared with Pearson distance and Euclidean distance, the squared Euclidean distance generated the best quality gene order computed by GA and ACO methods. PMID:23369541
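The three distance formulas compared in the study can be written out explicitly for two expression profiles; a short sketch:

```python
import math

# The three inter-profile distance measures compared in the study.
def pearson_distance(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return 1.0 - cov / (sx * sy)       # 0 for perfectly correlated profiles

def euclidean(x, y):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))

def squared_euclidean(x, y):
    return sum((a - b) ** 2 for a, b in zip(x, y))

d_pearson = pearson_distance([1, 2, 3], [2, 4, 6])
d_sq = squared_euclidean([1, 2, 3], [2, 4, 6])
```

Note the qualitative difference the study exploits: Pearson distance sees two proportional profiles as identical (distance 0), whereas the (squared) Euclidean distances still separate them by magnitude.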
The National Grid Project: A system overview
NASA Technical Reports Server (NTRS)
Gaither, Adam; Gaither, Kelly; Jean, Brian; Remotigue, Michael; Whitmire, John; Soni, Bharat; Thompson, Joe; Dannenhoffer, John; Weatherill, Nigel
1995-01-01
The National Grid Project (NGP) is a comprehensive numerical grid generation software system that is being developed at the National Science Foundation (NSF) Engineering Research Center (ERC) for Computational Field Simulation (CFS) at Mississippi State University (MSU). NGP is supported by a coalition of U.S. industries and federal laboratories. The objective of the NGP is to significantly decrease the amount of time it takes to generate a numerical grid for complex geometries and to increase the quality of these grids to enable computational field simulations for applications in industry. A geometric configuration can be discretized into grids (or meshes) that have two fundamental forms: structured and unstructured. Structured grids are formed by intersecting curvilinear coordinate lines and are composed of quadrilateral (2D) and hexahedral (3D) logically rectangular cells. The connectivity of a structured grid provides for trivial identification of neighboring points by incrementing coordinate indices. Unstructured grids are composed of cells of any shape (commonly triangles, quadrilaterals, tetrahedra and hexahedra), but do not have trivial identification of neighbors by incrementing an index. For unstructured grids, a set of points and an associated connectivity table is generated to define unstructured cell shapes and neighboring points. Hybrid grids are a combination of structured grids and unstructured grids. Chimera (overset) grids are intersecting or overlapping structured grids. The NGP system currently provides a user interface that integrates both 2D and 3D structured and unstructured grid generation, a solid modeling topology data management system, an internal Computer Aided Design (CAD) system based on Non-Uniform Rational B-Splines (NURBS), a journaling language, and a grid/solution visualization system.
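The structured/unstructured contrast described above can be shown in miniature: structured grids locate neighbors by incrementing indices, while unstructured grids require an explicit connectivity table built from the cell-to-node list. A small illustrative sketch:

```python
# Structured grids: neighbors come from incrementing (i, j) indices.
def structured_neighbors(i, j, ni, nj):
    cand = [(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]
    return [(a, b) for a, b in cand if 0 <= a < ni and 0 <= b < nj]

# Unstructured grids: adjacency must be built from a connectivity table.
# Two triangles sharing the edge (1, 2); node numbers are illustrative.
cells = [(0, 1, 2), (1, 3, 2)]
edge_cells = {}
for c, nodes in enumerate(cells):
    for k in range(3):
        e = tuple(sorted((nodes[k], nodes[(k + 1) % 3])))
        edge_cells.setdefault(e, []).append(c)
shared = [e for e, cs in edge_cells.items() if len(cs) == 2]

corner = structured_neighbors(0, 0, 4, 4)   # a corner cell has 2 neighbors
```

The extra bookkeeping on the unstructured side is the price paid for geometric flexibility, which is why systems like NGP support both forms plus hybrid and Chimera combinations.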
NASA Technical Reports Server (NTRS)
Dorband, John E.
1988-01-01
Sorting has long been used to organize data in preparation for further computation, but sort computation allows some types of computation to be performed during the sort. Sort aggregation and sort distribution are the two basic forms of sort computation. Sort aggregation generates an accumulative or aggregate result for each group of records and places this result in one of the records. An aggregate operation can be any operation that is both associative and commutative, i.e., any operation whose result does not depend on the order of the operands or the order in which the operations are performed. Sort distribution copies the value from a field of a specific record in a group into that field in every record of that group.
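The two basic forms of sort computation described above can be sketched as follows. This is a minimal illustration, not Dorband's implementation; the record layout and function names are invented:

```python
from itertools import groupby
from operator import itemgetter

def sort_aggregate(records, key, field, op):
    """Sort records by `key`, then combine `field` within each group
    using an associative, commutative operator `op`; the aggregate
    result is placed in one record of the group."""
    out = []
    for _, group in groupby(sorted(records, key=itemgetter(key)),
                            key=itemgetter(key)):
        group = list(group)
        total = group[0][field]
        for rec in group[1:]:
            total = op(total, rec[field])
        group[0] = {**group[0], field: total}  # result stored in first record
        out.extend(group)
    return out

def sort_distribute(records, key, field):
    """Copy `field` from a specific record (here, the first) of each
    key-group into every record of that group."""
    out = []
    for _, group in groupby(sorted(records, key=itemgetter(key)),
                            key=itemgetter(key)):
        group = list(group)
        value = group[0][field]
        out.extend({**rec, field: value} for rec in group)
    return out
```

Because the operator must be associative and commutative (e.g. addition, max), the result is independent of the order in which records meet during the sort, which is what lets the computation ride along with the sorting passes.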
Recent applications of the transonic wing analysis computer code, TWING
NASA Technical Reports Server (NTRS)
Subramanian, N. R.; Holst, T. L.; Thomas, S. D.
1982-01-01
An evaluation of the transonic-wing-analysis computer code TWING is given. TWING utilizes a fully implicit approximate factorization iteration scheme to solve the full potential equation in conservative form. A numerical elliptic-solver grid-generation scheme is used to generate the required finite-difference mesh. Several wing configurations were analyzed, and the limits of applicability of the code were evaluated. Comparisons of computed results were made with available experimental data. Results indicate that the code is robust, accurate (when significant viscous effects are not present), and efficient. TWING generally produces solutions an order of magnitude faster than other conservative full potential codes using successive-line overrelaxation. The present method is applicable to a wide range of isolated wing configurations, including high-aspect-ratio transport wings and low-aspect-ratio, high-sweep fighter configurations.
The benefits of computer-generated feedback for mathematics problem solving.
Fyfe, Emily R; Rittle-Johnson, Bethany
2016-07-01
The goal of the current research was to better understand when and why feedback has positive effects on learning and to identify features of feedback that may improve its efficacy. In a randomized experiment, second-grade children received instruction on a correct problem-solving strategy and then solved a set of relevant problems. Children were assigned to receive no feedback, immediate feedback, or summative feedback from the computer. On a posttest the following day, feedback resulted in higher scores relative to no feedback for children who started with low prior knowledge. Immediate feedback was particularly effective, facilitating mastery of the material for children with both low and high prior knowledge. Results suggest that minimal computer-generated feedback can be a powerful form of guidance during problem solving.
Approximate Bayesian computation in large-scale structure: constraining the galaxy-halo connection
NASA Astrophysics Data System (ADS)
Hahn, ChangHoon; Vakili, Mohammadjavad; Walsh, Kilian; Hearin, Andrew P.; Hogg, David W.; Campbell, Duncan
2017-08-01
Standard approaches to Bayesian parameter inference in large-scale structure assume a Gaussian functional form (chi-squared form) for the likelihood. This assumption, in detail, cannot be correct. Likelihood-free inference methods such as approximate Bayesian computation (ABC) relax these restrictions and make inference possible without any assumptions on the likelihood. Instead, ABC relies on a forward generative model of the data and a metric for measuring the distance between the model and the data. In this work, we demonstrate that ABC is feasible for LSS parameter inference by using it to constrain parameters of the halo occupation distribution (HOD) model for populating dark matter haloes with galaxies. Using a specific implementation of ABC supplemented with population Monte Carlo importance sampling, a generative forward model using the HOD, and a distance metric based on the galaxy number density, two-point correlation function, and galaxy group multiplicity function, we constrain the HOD parameters of a mock observation generated from selected 'true' HOD parameters. The parameter constraints we obtain from ABC are consistent with the 'true' HOD parameters, demonstrating that ABC can be reliably used for parameter inference in LSS. Furthermore, we compare our ABC constraints to those we obtain using a pseudo-likelihood function of Gaussian form with MCMC and find consistent HOD parameter constraints. Ultimately, our results suggest that ABC can and should be applied in parameter inference for LSS analyses.
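The core ABC loop the abstract describes (draw parameters from the prior, simulate with a forward model, keep draws whose simulated data lie close to the observation) can be sketched with a toy rejection sampler. The paper itself uses population Monte Carlo importance sampling and an HOD forward model; this one-parameter Gaussian example is purely illustrative:

```python
import random

def abc_rejection(observed, prior_sample, forward_model, distance, eps, n_draws=5000):
    """Minimal ABC rejection sampler: draw parameters from the prior,
    simulate data, and keep draws whose simulated data fall within
    `eps` of the observation under `distance`."""
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample()
        if distance(forward_model(theta), observed) < eps:
            accepted.append(theta)
    return accepted

# Toy inference: recover the mean of a Gaussian from its sample mean.
random.seed(0)
obs = sum(random.gauss(2.0, 1.0) for _ in range(100)) / 100  # "observed" summary

posterior = abc_rejection(
    observed=obs,
    prior_sample=lambda: random.uniform(-5.0, 5.0),   # flat prior on the mean
    forward_model=lambda mu: sum(random.gauss(mu, 1.0) for _ in range(100)) / 100,
    distance=lambda a, b: abs(a - b),                 # distance between summaries
    eps=0.2,
)
estimate = sum(posterior) / len(posterior)
```

No likelihood is ever evaluated: only the forward model and the distance metric are used, which is the property that makes ABC attractive when the Gaussian likelihood assumption breaks down.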
Kindgen, Sarah; Wachtel, Herbert; Abrahamsson, Bertil; Langguth, Peter
2015-09-01
Disintegration of oral solid dosage forms is a prerequisite for drug dissolution and absorption and is to a large extent dependent on the pressures and hydrodynamic conditions in the solution to which the dosage form is exposed. In this work, the hydrodynamics in the PhEur/USP disintegration tester were investigated using computational fluid dynamics (CFD). Particle image velocimetry was used to validate the CFD predictions. The CFD simulations were performed with different Newtonian and non-Newtonian fluids, representing fasted and fed states. The results indicate that the current design and operating conditions of the disintegration test device, given by the pharmacopoeias, do not reproduce the in vivo situation. This holds true for the hydrodynamics in the disintegration tester, which generates Reynolds numbers dissimilar to the reported in vivo situation. Also, the homogenized US FDA meal used to represent the fed state generates excessively high viscosities and relative pressures. The forces acting on the dosage form are too small for all fluids compared with the in vivo situation. The lack of peristaltic contractions, which generate hydrodynamics and shear stress in vivo, may be the major drawback of the compendial device, resulting in the observed differences between predicted and in vivo measured hydrodynamics.
NASA Technical Reports Server (NTRS)
Frost, W.; Long, B. H.; Turner, R. E.
1978-01-01
The guidelines are given in the form of design criteria relative to wind speed, wind shear, turbulence, wind direction, ice and snow loading, and other climatological parameters which include rain, hail, thermal effects, abrasive and corrosive effects, and humidity. This report is a presentation of design criteria in an engineering format which can be directly input to wind turbine generator design computations. Guidelines are also provided for developing specialized wind turbine generators or for designing wind turbine generators which are to be used in a special region of the United States.
Reconfigurable Optical Interconnections Via Dynamic Computer-Generated Holograms
NASA Technical Reports Server (NTRS)
Liu, Hua-Kuang (Inventor); Zhou, Shao-Min (Inventor)
1996-01-01
A system is presented for optically providing one-to-many irregular interconnections, as well as many-to-many irregular interconnections with adjustable strengths (weights) w_ij, using multiple laser beams that address multiple holograms and means for combining the beams modified by the holograms to form multiple interconnections, such as a cross-bar switching network. The optical means of interconnection is based on entering a series of complex computer-generated holograms on an electrically addressed spatial light modulator for real-time reconfiguration, thus providing the flexibility needed for large-scale practical interconnection networks. By employing multiple sources and holograms, the number of achievable interconnection patterns is greatly increased.
Two Dimensional Mechanism for Insect Hovering
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jane Wang, Z.
2000-09-04
Resolved computation of two-dimensional insect hovering shows for the first time that a two-dimensional hovering motion can generate enough lift to support a typical insect weight. The computation reveals a two-dimensional mechanism of creating a downward dipole jet of counterrotating vortices, which are formed from leading- and trailing-edge vortices. The vortex dynamics further elucidates the role of the phase relation between the wing translation and rotation in lift generation and explains why the instantaneous forces can reach a periodic state after only a few strokes. The model predicts the lower limits in Reynolds number and amplitude above which the averaged forces are sufficient.
Packing microstructure and local density variations of experimental and computational pebble beds
DOE Office of Scientific and Technical Information (OSTI.GOV)
Auwerda, G. J.; Kloosterman, J. L.; Lathouwers, D.
2012-07-01
In pebble bed type nuclear reactors the fuel is contained in graphite pebbles, which form a randomly stacked bed with a non-uniform packing density. These variations can influence local coolant flow and power density and are a possible cause of hotspots. To analyse local density variations, computational methods are needed that can generate randomly stacked pebble beds with a realistic packing structure on a pebble-to-pebble level. We first compare various properties of the local packing structure of a computed bed with those of an image made using computer-aided X-ray tomography, looking at properties in the bulk of the bed and near the wall separately. Especially for the bulk of the bed, properties of the computed bed compare well with the scanned bed and with the literature, giving confidence that our method generates beds with a realistic packing microstructure. Results also show the packing structure is different near the wall than in the bulk of the bed, with pebbles near the wall forming ordered layers similar to hexagonal close packing. Next, variations in the local packing density are investigated by comparing probability density functions of the packing fraction of small clusters of pebbles throughout the bed. Especially near the wall, large variations in local packing fraction exist, with a higher probability for clusters of pebbles with both low (<0.6) and high (>0.65) packing fractions, which could significantly affect flow rates and, together with higher power densities, could result in hotspots.
Toward an automated parallel computing environment for geosciences
NASA Astrophysics Data System (ADS)
Zhang, Huai; Liu, Mian; Shi, Yaolin; Yuen, David A.; Yan, Zhenzhen; Liang, Guoping
2007-08-01
Software for geodynamic modeling has not kept up with the fast growing computing hardware and network resources. In the past decade supercomputing power has become available to most researchers in the form of affordable Beowulf clusters and other parallel computer platforms. However, to take full advantage of such computing power requires developing parallel algorithms and associated software, a task that is often too daunting for geoscience modelers whose main expertise is in geosciences. We introduce here an automated parallel computing environment built on open-source algorithms and libraries. Users interact with this computing environment by specifying the partial differential equations, solvers, and model-specific properties using an English-like modeling language in the input files. The system then automatically generates the finite element codes that can be run on distributed or shared memory parallel machines. This system is dynamic and flexible, allowing users to address different problems in geosciences. It is capable of providing web-based services, enabling users to generate source codes online. This unique feature will facilitate high-performance computing to be integrated with distributed data grids in the emerging cyber-infrastructures for geosciences. In this paper we discuss the principles of this automated modeling environment and provide examples to demonstrate its versatility.
DEMONSTRATION OF THE ANALYTIC ELEMENT METHOD FOR WELLHEAD PROTECTION
A new computer program has been developed to determine time-of-travel capture zones in relatively simple geohydrological settings. The WhAEM package contains an analytic element model that uses superposition of (many) closed form analytical solutions to generate a ground-water fl...
HEPMath 1.4: A mathematica package for semi-automatic computations in high energy physics
NASA Astrophysics Data System (ADS)
Wiebusch, Martin
2015-10-01
This article introduces the Mathematica package HEPMath which provides a number of utilities and algorithms for High Energy Physics computations in Mathematica. Its functionality is similar to packages like FormCalc or FeynCalc, but it takes a more complete and extensible approach to implementing common High Energy Physics notations in the Mathematica language, in particular those related to tensors and index contractions. It also provides a more flexible method for the generation of numerical code which is based on new features for C code generation in Mathematica. In particular it can automatically generate Python extension modules which make the compiled functions callable from Python, thus eliminating the need to write any code in a low-level language like C or Fortran. It also contains seamless interfaces to LHAPDF, FeynArts, and LoopTools.
PROTEUS two-dimensional Navier-Stokes computer code, version 1.0. Volume 2: User's guide
NASA Technical Reports Server (NTRS)
Towne, Charles E.; Schwab, John R.; Benson, Thomas J.; Suresh, Ambady
1990-01-01
A new computer code was developed to solve the two-dimensional or axisymmetric, Reynolds averaged, unsteady compressible Navier-Stokes equations in strong conservation law form. The thin-layer or Euler equations may also be solved. Turbulence is modeled using an algebraic eddy viscosity model. The objective was to develop a code for aerospace applications that is easy to use and easy to modify. Code readability, modularity, and documentation were emphasized. The equations are written in nonorthogonal body-fitted coordinates, and solved by marching in time using a fully-coupled alternating direction-implicit procedure with generalized first- or second-order time differencing. All terms are linearized using second-order Taylor series. The boundary conditions are treated implicitly, and may be steady, unsteady, or spatially periodic. Simple Cartesian or polar grids may be generated internally by the program. More complex geometries require an externally generated computational coordinate system. The documentation is divided into three volumes. Volume 2 is the User's Guide, and describes the program's general features, the input and output, the procedure for setting up initial conditions, the computer resource requirements, the diagnostic messages that may be generated, the job control language used to run the program, and several test cases.
Generation of anisotropy in turbulent flows subjected to rapid distortion
NASA Astrophysics Data System (ADS)
Clark, Timothy T.; Kurien, Susan; Rubinstein, Robert
2018-01-01
A computational tool for the anisotropic time-evolution of the spectral velocity correlation tensor is presented. We operate in the linear, rapid distortion limit of the mean-field-coupled equations. Each term of the equations is written in the form of an expansion to arbitrary order in the basis of irreducible representations of the SO(3) symmetry group. The computational algorithm for this calculation solves a system of coupled equations for the scalar weights of each generated anisotropic mode. The analysis demonstrates that rapid distortion rapidly but systematically generates higher-order anisotropic modes. To maintain a tractable computation, the maximum number of rotational modes to be used in a given calculation is specified a priori. The computed Reynolds stress converges to the theoretical result derived by Batchelor and Proudman [Quart. J. Mech. Appl. Math. 7, 83 (1954), 10.1093/qjmam/7.1.83] if a sufficiently large maximum number of rotational modes is utilized; more modes are required to recover the solution at later times. The emergence and evolution of the underlying multidimensional space of functions is presented here using a 64-mode calculation. Alternative implications for modeling strategies are discussed.
Generation of orthogonal surface coordinates
NASA Technical Reports Server (NTRS)
Blottner, F. G.; Moreno, J. B.
1980-01-01
Two generation methods were developed for three dimensional flows where the computational domain normal to the surface is small. With this restriction the coordinate system requires orthogonality only at the body surface. The first method uses the orthogonal condition in finite-difference form to determine the surface coordinates with the metric coefficients and curvature of the coordinate lines calculated numerically. The second method obtains analytical expressions for the metric coefficients and for the curvature of the coordinate lines.
Electrically generated eddies at an eightfold stagnation point within a nanopore
Sherwood, J. D.; Mao, M.; Ghosal, S.
2014-01-01
Electrically generated flows around a thin dielectric plate pierced by a cylindrical hole are computed numerically. The geometry represents that of a single nanopore in a membrane. When the membrane is uncharged, flow is due solely to induced charge electroosmosis, and eddies are generated by the high fields at the corners of the nanopore. These eddies meet at stagnation points. If the geometry is chosen correctly, the stagnation points merge to form a single stagnation point at which four streamlines cross at a point and eight eddies meet. PMID:25489206
Algorithms for the explicit computation of Penrose diagrams
NASA Astrophysics Data System (ADS)
Schindler, J. C.; Aguirre, A.
2018-05-01
An algorithm is given for explicitly computing Penrose diagrams for spacetimes of the form . The resulting diagram coordinates are shown to extend the metric continuously and nondegenerately across an arbitrary number of horizons. The method is extended to include piecewise approximations to dynamically evolving spacetimes using a standard hypersurface junction procedure. Examples generated by an implementation of the algorithm are shown for standard and new cases. In the appendix, this algorithm is compared to existing methods.
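The compactification underlying any Penrose diagram can be illustrated for the simplest case, flat spacetime: null coordinates are passed through arctan so that infinity lands at a finite diagram coordinate. This is a textbook Minkowski example, not the authors' general multi-horizon algorithm:

```python
import math

def penrose_coords(t, r):
    """Map a Minkowski event (t, r) to compact Penrose diagram
    coordinates. Null coordinates u = t - r and v = t + r are
    compactified with arctan, so all of spacetime fits inside a
    finite triangle while radial light rays stay at 45 degrees."""
    U = math.atan(t - r)
    V = math.atan(t + r)
    T = (V + U) / 2.0  # diagram time, bounded by pi/2
    X = (V - U) / 2.0  # diagram radius, bounded by pi/2
    return T, X
```

For example, an event at enormous radius on the t = 0 slice maps to a point just inside spatial infinity at X = pi/2; because the map acts on null coordinates, it is conformal and preserves the causal structure that the diagram is meant to display.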
Spotting and designing promiscuous ligands for drug discovery.
Schneider, P; Röthlisberger, M; Reker, D; Schneider, G
2016-01-21
The promiscuous binding behavior of bioactive compounds forms a mechanistic basis for understanding polypharmacological drug action. We present the development and prospective application of a computational tool for identifying potential promiscuous drug-like ligands. In combination with computational target prediction methods, the approach provides a working concept for rationally designing such molecular structures. We could confirm the multi-target binding of a de novo generated compound in a proof-of-concept study relying on the new method.
How Insects Initiate Flight: Computational Analysis of a Damselfly in Takeoff Flight
NASA Astrophysics Data System (ADS)
Bode-Oke, Ayodeji; Zeyghami, Samane; Dong, Haibo; Flow Simulation Research Group Team
2017-11-01
Flight initiation is essential for survival in biological fliers and can be classified into jumping and non-jumping takeoffs. During jumping takeoffs, the legs generate most of the initial impulse, whereas in non-jumping takeoffs, which are usually voluntary, slow, and stable, the wings generate most of the forces. It is of interest to understand how non-jumping takeoffs occur and what strategies insects use to generate the required forces. Using a high fidelity computational fluid dynamics simulation, we identify the flow features and compute the wing aerodynamic forces to elucidate how flight forces are generated by a damselfly performing a non-jumping takeoff. Our results show that a damselfly generates about three times its bodyweight during the first half-stroke for liftoff while flapping through a steeply inclined stroke plane and slicing the air at high angles of attack. Consequently, a Leading Edge Vortex (LEV) is formed during both the downstroke and upstroke on all four wings. The formation of the LEV, however, is inhibited in the subsequent upstrokes following takeoff. Accordingly, we observe a drastic reduction in the magnitude of the aerodynamic force, signifying the importance of the LEV in augmenting force production. This work was supported by National Science Foundation [CBET-1313217] and Air Force Research Laboratory [FA9550-12-1-007].
Program For Generating Interactive Displays
NASA Technical Reports Server (NTRS)
Costenbader, Jay; Moleski, Walt; Szczur, Martha; Howell, David; Engelberg, Norm; Li, Tin P.; Misra, Dharitri; Miller, Philip; Neve, Leif; Wolf, Karl;
1991-01-01
Sun/Unix version of Transportable Applications Environment Plus (TAE+) computer program provides integrated, portable software environment for developing and running interactive window, text, and graphical-object-based application software systems. Enables programmer or nonprogrammer to construct easily custom software interface between user and application program and to move resulting interface program and its application program to different computers. Plus viewed as productivity tool for application developers and application end users, who benefit from resultant consistent and well-designed user interface sheltering them from intricacies of computer. Available in form suitable for following six different groups of computers: DEC VAX station and other VMS VAX computers, Macintosh II computers running AUX, Apollo Domain Series 3000, DEC VAX and reduced-instruction-set-computer workstations running Ultrix, Sun 3- and 4-series workstations running Sun OS and IBM RT/PC and PS/2 compute
Another Program For Generating Interactive Graphics
NASA Technical Reports Server (NTRS)
Costenbader, Jay; Moleski, Walt; Szczur, Martha; Howell, David; Engelberg, Norm; Li, Tin P.; Misra, Dharitri; Miller, Philip; Neve, Leif; Wolf, Karl;
1991-01-01
VAX/Ultrix version of Transportable Applications Environment Plus (TAE+) computer program provides integrated, portable software environment for developing and running interactive window, text, and graphical-object-based application software systems. Enables programmer or nonprogrammer to construct easily custom software interface between user and application program and to move resulting interface program and its application program to different computers. When used throughout company for wide range of applications, makes both application program and computer seem transparent, with noticeable improvements in learning curve. Available in form suitable for following six different groups of computers: DEC VAX station and other VMS VAX computers, Macintosh II computers running AUX, Apollo Domain Series 3000, DEC VAX and reduced-instruction-set-computer workstations running Ultrix, Sun 3- and 4-series workstations running Sun OS and IBM RT/PC's and PS/2 computers running AIX, and HP 9000 S
NASA Astrophysics Data System (ADS)
Thomas, W. A.; McAnally, W. H., Jr.
1985-07-01
TABS-2 is a generalized numerical modeling system for open-channel flows, sedimentation, and constituent transport. It consists of more than 40 computer programs to perform modeling and related tasks. The major modeling components--RMA-2V, STUDH, and RMA-4--calculate two-dimensional, depth-averaged flows, sedimentation, and dispersive transport, respectively. The other programs in the system perform digitizing, mesh generation, data management, graphical display, output analysis, and model interfacing tasks. Utilities include file management and automatic generation of computer job control instructions. TABS-2 has been applied to a variety of waterways, including rivers, estuaries, bays, and marshes. It is designed for use by engineers and scientists who may not have a rigorous computer background. Use of the various components is described in Appendices A-O. The bound version of the report does not include the appendices. A looseleaf form with Appendices A-O is distributed to system users.
47 CFR 1.743 - Who may sign applications.
Code of Federal Regulations, 2011 CFR
2011-10-01
... Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Complaints, Applications, Tariffs..., including revocation of station license pursuant to section 312(a)(1) of the Communications Act of 1934, as... formed by computer-generated electronic impulses. [28 FR 12450, Nov. 22, 1963, as amended at 53 FR 17193...
47 CFR 1.743 - Who may sign applications.
Code of Federal Regulations, 2012 CFR
2012-10-01
... Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Complaints, Applications, Tariffs..., including revocation of station license pursuant to section 312(a)(1) of the Communications Act of 1934, as... formed by computer-generated electronic impulses. [28 FR 12450, Nov. 22, 1963, as amended at 53 FR 17193...
47 CFR 1.743 - Who may sign applications.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Complaints, Applications, Tariffs..., including revocation of station license pursuant to section 312(a)(1) of the Communications Act of 1934, as... formed by computer-generated electronic impulses. [28 FR 12450, Nov. 22, 1963, as amended at 53 FR 17193...
DEMONSTRATION OF THE ANALYTIC ELEMENT METHOD FOR WELLHEAD PROTECTION - PROJECT SUMMARY
A new computer program has been developed to determine time-of-travel capture zones in relatively simple geohydrological settings. The WhAEM package contains an analytic element model that uses superposition of (many) closed form analytical solutions to generate a ground-water fl...
41 CFR 109-38.903-50 - Reporting DOE motor vehicle data.
Code of Federal Regulations, 2014 CFR
2014-01-01
... Property Management Regulations System (Continued) DEPARTMENT OF ENERGY PROPERTY MANAGEMENT REGULATIONS... operating DOE-owned or commercially-leased motor vehicles shall prepare the following reports using SF 82... the report forms may be obtained by contacting the DPMO. (d) Personal computer generated reports are...
41 CFR 109-38.903-50 - Reporting DOE motor vehicle data.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Property Management Regulations System (Continued) DEPARTMENT OF ENERGY PROPERTY MANAGEMENT REGULATIONS... operating DOE-owned or commercially-leased motor vehicles shall prepare the following reports using SF 82... the report forms may be obtained by contacting the DPMO. (d) Personal computer generated reports are...
41 CFR 109-38.903-50 - Reporting DOE motor vehicle data.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Property Management Regulations System (Continued) DEPARTMENT OF ENERGY PROPERTY MANAGEMENT REGULATIONS... operating DOE-owned or commercially-leased motor vehicles shall prepare the following reports using SF 82... the report forms may be obtained by contacting the DPMO. (d) Personal computer generated reports are...
41 CFR 109-38.903-50 - Reporting DOE motor vehicle data.
Code of Federal Regulations, 2012 CFR
2012-01-01
... Property Management Regulations System (Continued) DEPARTMENT OF ENERGY PROPERTY MANAGEMENT REGULATIONS... operating DOE-owned or commercially-leased motor vehicles shall prepare the following reports using SF 82... the report forms may be obtained by contacting the DPMO. (d) Personal computer generated reports are...
1998-02-27
NASA researcher Dr. Donald Frazier uses a blue laser shining through a quartz window into a special mix of chemicals to generate a polymer film on the inside quartz surface. As the chemicals respond to the laser light, they adhere to the glass surface, forming optical films. Dr. Frazier and Dr. Mark S. Paley developed the process in the Space Sciences Laboratory at NASA's Marshall Space Flight Center in Huntsville, AL. Working aboard the Space Shuttle, a science team led by Dr. Frazier formed thin-films potentially useful in optical computers with fewer impurities than those formed on Earth. Patterns of these films can be traced onto the quartz surface. In the optical computers of the future, these films could replace electronic circuits and wires, making the systems more efficient and cost-effective, as well as lighter and more compact. Photo credit: NASA/Marshall Space Flight Center
1999-05-26
NASA researcher Dr. Donald Frazier uses a blue laser shining through a quartz window into a special mix of chemicals to generate a polymer film on the inside quartz surface. As the chemicals respond to the laser light, they adhere to the glass surface, forming optical films. Dr. Frazier and Dr. Mark S. Paley developed the process in the Space Sciences Laboratory at NASA's Marshall Space Flight Center in Huntsville, AL. Working aboard the Space Shuttle, a science team led by Dr. Frazier formed thin-films potentially useful in optical computers with fewer impurities than those formed on Earth. Patterns of these films can be traced onto the quartz surface. In the optical computers of the future, these films could replace electronic circuits and wires, making the systems more efficient and cost-effective, as well as lighter and more compact. Photo credit: NASA/Marshall Space Flight Center
Constructing Nucleon Operators on a Lattice for Form Factors with High Momentum Transfer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Syritsyn, Sergey; Gambhir, Arjun S.; Musch, Bernhard U.
We present preliminary results of computing nucleon form factors at high momentum transfer using the 'boosted' or 'momentum' smearing. We use gauge configurations generated with Nf = 2 + 1 dynamical Wilson-clover fermions and study the connected as well as disconnected contributions to the nucleon form factors. Our initial results indicate that boosted smearing helps to improve the signal for nucleon correlators at high momentum. However, we also find evidence for large excited-state contributions, which will likely require variational analysis to isolate the boosted nucleon ground state.
Interactive computer graphics - Why's, wherefore's and examples
NASA Technical Reports Server (NTRS)
Gregory, T. J.; Carmichael, R. L.
1983-01-01
The benefits of using computer graphics in design are briefly reviewed. It is shown that computer graphics substantially aids productivity by permitting errors in design to be found immediately and by greatly reducing the cost of fixing the errors and the cost of redoing the process. The possibilities offered by computer-generated displays in terms of information content are emphasized, along with the form in which the information is transferred. The human being is ideally and naturally suited to dealing with information in picture format, and the content rate in communication with pictures is several orders of magnitude greater than with words or even graphs. Since science and engineering involve communicating ideas, concepts, and information, the benefits of computer graphics cannot be overestimated.
NASA Technical Reports Server (NTRS)
Goldfarb, W.; Carpenter, L. C.; Redhed, D. D.; Hansen, S. D.; Anderson, L. O.; Kawaguchi, A. S.
1973-01-01
The computing system design of IPAD is described and the requirements which form the basis for the system design are discussed. The system is presented in terms of a functional design description and technical design specifications. The functional design specifications give the detailed description of the system design using top-down structured programming methodology. Human behavioral characteristics, which specify the system design at the user interface, security considerations, and standards for system design, implementation, and maintenance are also part of the technical design specifications. Detailed specifications of the two most common computing system types in use by the major aerospace companies which could support the IPAD system design are presented. The report of a study to investigate migration of IPAD software between the two candidate 3rd generation host computing systems and from these systems to a 4th generation system is included.
Zhao, Min; Wang, Qingguo; Wang, Quan; Jia, Peilin; Zhao, Zhongming
2013-01-01
Copy number variation (CNV) is a prevalent form of critical genetic variation that leads to an abnormal number of copies of large genomic regions in a cell. Microarray-based comparative genome hybridization (arrayCGH) and genotyping arrays were the standard technologies for detecting large regions subject to copy number changes until, most recently, high-resolution sequence data became available through next-generation sequencing (NGS). During the last several years, NGS-based analysis has been widely applied to identify CNVs in both healthy and diseased individuals. Correspondingly, the strong demand for NGS-based CNV analyses has fuelled the development of numerous computational methods and tools for CNV detection. In this article, we review recent advances in computational methods pertaining to CNV detection using whole-genome and whole-exome sequencing data. Additionally, we discuss their strengths and weaknesses and suggest directions for future development. PMID:24564169
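As a toy illustration of the read-depth strategy that many NGS-based CNV callers build on (a sketch only: the windowing, normalization, and thresholds below are invented for the example, and real tools add GC correction and segmentation):

```python
# Toy read-depth CNV caller: flags windows whose coverage deviates
# from the genome-wide median, a simplified version of the
# depth-of-coverage strategy used by NGS-based CNV detection tools.

def call_cnvs(window_depths, min_ratio=1.5):
    """Return (index, 'gain'|'loss') for windows whose depth ratio
    against the median exceeds min_ratio (gain) or falls below
    1/min_ratio (loss)."""
    depths = sorted(window_depths)
    n = len(depths)
    median = (depths[n // 2] if n % 2 else
              (depths[n // 2 - 1] + depths[n // 2]) / 2)
    calls = []
    for i, d in enumerate(window_depths):
        ratio = d / median
        if ratio >= min_ratio:
            calls.append((i, "gain"))
        elif ratio <= 1 / min_ratio:
            calls.append((i, "loss"))
    return calls

depths = [30, 31, 29, 30, 62, 61, 30, 14, 30]
print(call_cnvs(depths))  # windows 4-5 look duplicated, window 7 deleted
```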
On Writing and Reading Artistic Computational Ecosystems.
Antunes, Rui Filipe; Leymarie, Frederic Fol; Latham, William
2015-01-01
We study the use of the generative systems known as computational ecosystems to convey artistic and narrative aims. These are virtual worlds running on computers, composed of agents that trade units of energy and emulate cycles of life and behaviors adapted from biological life forms. In this article we propose a conceptual framework in order to understand these systems, which are involved in processes of authorship and interpretation that this investigation analyzes in order to identify critical instruments for artistic exploration. We formulate a model of narrative that we call system stories (after Mitchell Whitelaw), characterized by the dynamic network of material and conceptual processes that define these artefacts. They account for narrative constellations with multiple agencies from which meaning and messages emerge. Finally, we present three case studies to explore the potential of this model within an artistic and generative domain, arguing that this understanding expands and enriches the palette of the language of these systems.
Conformational analysis by intersection: CONAN.
Smellie, Andrew; Stanton, Robert; Henne, Randy; Teig, Steve
2003-01-15
As high throughput techniques in chemical synthesis and screening improve, more demands are placed on computer assisted design and virtual screening. Many of these computational methods require one or more three-dimensional conformations for molecules, creating a demand for a conformational analysis tool that can rapidly and robustly cover the low-energy conformational spaces of small molecules. A new algorithm of intersection is presented here, which quickly generates (on average <0.5 seconds/stereoisomer) a complete description of the low energy conformational space of a small molecule. The molecule is first decomposed into nonoverlapping nodes N (usually rings) and overlapping paths P with conformations (N and P) generated in an offline process. In a second step the node and path data are combined to form distinct conformers of the molecule. Finally, heuristics are applied after intersection to generate a small representative collection of conformations that span the conformational space. In a study of approximately 97,000 randomly selected molecules from the MDDR, results are presented that explore these conformations and their ability to cover low-energy conformational space. Copyright 2002 Wiley Periodicals, Inc. J Comput Chem 24: 10-20, 2003
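The node-and-path intersection step can be caricatured as a Cartesian product of precomputed fragment conformations followed by an energy-pruning heuristic. The fragment names, energies, and crude additivity assumption below are purely illustrative, not CONAN's actual data model:

```python
from itertools import product

# Highly simplified sketch of the intersection idea: precomputed
# conformations of non-overlapping fragments are combined by Cartesian
# product, then pruned to a representative low-energy set.
# Each fragment conformation is represented only as (label, energy).

def intersect(fragment_confs, energy_window=5.0):
    """fragment_confs: list of lists of (label, energy) per fragment.
    Returns combined conformers within energy_window of the minimum."""
    combos = []
    for combo in product(*fragment_confs):
        labels = tuple(c[0] for c in combo)
        energy = sum(c[1] for c in combo)  # crude additivity assumption
        combos.append((labels, energy))
    e_min = min(e for _, e in combos)
    return [c for c in combos if c[1] - e_min <= energy_window]

ring = [("chair", 0.0), ("boat", 6.5)]
tail = [("anti", 0.0), ("gauche", 0.9)]
for labels, e in intersect([ring, tail]):
    print(labels, e)  # only the low-energy combinations survive
```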
VAMP: A computer program for calculating volume, area, and mass properties of aerospace vehicles
NASA Technical Reports Server (NTRS)
Norton, P. J.; Glatt, C. R.
1974-01-01
A computerized procedure developed for analyzing aerospace vehicles evaluates the properties of elemental surface areas with specified thickness by accumulating and combining them with arbitrarily specified mass elements to form a complete evaluation. Picture-like images of the geometric description are capable of being generated.
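The accumulate-and-combine step can be sketched in a few lines; the panel and point-mass values below are invented for illustration and the real VAMP program of course handles full inertia tensors and geometric surface generation:

```python
# Minimal sketch of accumulating mass properties in the spirit of VAMP:
# panel elements (area, thickness, density, centroid) and discrete point
# masses are combined into a total mass and center of gravity.

def mass_properties(panels, point_masses):
    """panels: (area, thickness, density, (x, y, z)) tuples;
    point_masses: (mass, (x, y, z)) tuples."""
    total = 0.0
    moment = [0.0, 0.0, 0.0]
    items = [(a * t * rho, c) for a, t, rho, c in panels]
    items += list(point_masses)
    for m, (x, y, z) in items:
        total += m
        moment[0] += m * x
        moment[1] += m * y
        moment[2] += m * z
    cg = tuple(s / total for s in moment)
    return total, cg

panels = [(2.0, 0.01, 2700.0, (1.0, 0.0, 0.0))]   # 54 kg aluminum panel
points = [(54.0, (3.0, 0.0, 0.0))]                # equipment point mass
total, cg = mass_properties(panels, points)
print(total, cg)  # 108.0 kg, CG at x = 2.0
```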
46 CFR 14.307 - Entries on certificate of discharge.
Code of Federal Regulations, 2012 CFR
2012-10-01
... them on the prescribed form with permanent ink or generating them from computer in the prescribed format; and shall sign them with permanent ink. The prescribed format for a certificate of discharge is...) Each mariner being discharged shall sign the certificate and both copies with permanent ink. (c) When...
46 CFR 14.307 - Entries on certificate of discharge.
Code of Federal Regulations, 2010 CFR
2010-10-01
... them on the prescribed form with permanent ink or generating them from computer in the prescribed format; and shall sign them with permanent ink. The prescribed format for a certificate of discharge is...) Each mariner being discharged shall sign the certificate and both copies with permanent ink. (c) When...
46 CFR 14.307 - Entries on certificate of discharge.
Code of Federal Regulations, 2014 CFR
2014-10-01
... them on the prescribed form with permanent ink or generating them from computer in the prescribed format, and must sign them with permanent ink. The prescribed format for a certificate of discharge is.... (b) Each mariner being discharged must sign the certificate and both copies with permanent ink. (c...
46 CFR 14.307 - Entries on certificate of discharge.
Code of Federal Regulations, 2011 CFR
2011-10-01
... them on the prescribed form with permanent ink or generating them from computer in the prescribed format; and shall sign them with permanent ink. The prescribed format for a certificate of discharge is...) Each mariner being discharged shall sign the certificate and both copies with permanent ink. (c) When...
46 CFR 14.307 - Entries on certificate of discharge.
Code of Federal Regulations, 2013 CFR
2013-10-01
... them on the prescribed form with permanent ink or generating them from computer in the prescribed format; and shall sign them with permanent ink. The prescribed format for a certificate of discharge is...) Each mariner being discharged shall sign the certificate and both copies with permanent ink. (c) When...
EPAs National Center for Computational Toxicology is building capabilities to support a new paradigm for toxicity screening and prediction. The DSSTox project is improving public access to quality structure-annotated chemical toxicity information in less summarized forms than tr...
WHAEM: PROGRAM DOCUMENTATION FOR THE WELLHEAD ANALYTIC ELEMENT MODEL (EPA/600/SR-94/210)
A new computer program has been developed to determine time-of-travel capture zones in relatively simple geohydrological settings. The WhAEM package contains an analytic element model that uses superposition of (many) closed form analytical solutions to generate a groundwater flo...
7 CFR 1710.401 - Loan application documents.
Code of Federal Regulations, 2012 CFR
2012-01-01
..., Checklist for Electric Loan Application, or a computer generated equivalent as this list. (1) Transmittal... beginning date of the loan period and shall be the same as the date on the Financial and Statistical Report... headquarters facilities, Form 740g need not be submitted. (5) Financial and statistical report. Distribution...
7 CFR 1710.401 - Loan application documents.
Code of Federal Regulations, 2013 CFR
2013-01-01
..., Checklist for Electric Loan Application, or a computer generated equivalent as this list. (1) Transmittal... beginning date of the loan period and shall be the same as the date on the Financial and Statistical Report... headquarters facilities, Form 740g need not be submitted. (5) Financial and statistical report. Distribution...
7 CFR 1710.401 - Loan application documents.
Code of Federal Regulations, 2014 CFR
2014-01-01
..., Checklist for Electric Loan Application, or a computer generated equivalent as this list. (1) Transmittal... beginning date of the loan period and shall be the same as the date on the Financial and Statistical Report... headquarters facilities, Form 740g need not be submitted. (5) Financial and statistical report. Distribution...
Diffractive micro-optical element with nonpoint response
NASA Astrophysics Data System (ADS)
Soifer, Victor A.; Golub, Michael A.
1993-01-01
Common-use diffractive lenses have microrelief zones in the form of simple rings that provide only an optical power but do not contain any image information. They have a point-image response under point-source illumination. A more complicated non-point response is needed to focus a light beam into different light marks and letter-type images, as well as for optical pattern recognition. The current presentation describes computer generation of diffractive micro-optical elements with complicated curvilinear zones of a regular piecewise-smooth structure and grey-level or staircase phase microrelief. The manufacture of non-point response elements uses the steps of phase-transfer calculation and orthogonal-scan mask generation or lithographic glass etching. The ray-tracing method is shown to be applicable to this task. Several working samples of focusing optical elements generated by computer and photolithography are presented. Using the experimental results we discuss applications such as laser branding.
Behavioral and computational aspects of language and its acquisition
NASA Astrophysics Data System (ADS)
Edelman, Shimon; Waterfall, Heidi
2007-12-01
One of the greatest challenges facing the cognitive sciences is to explain what it means to know a language, and how the knowledge of language is acquired. The dominant approach to this challenge within linguistics has been to seek an efficient characterization of the wealth of documented structural properties of language in terms of a compact generative grammar-ideally, the minimal necessary set of innate, universal, exception-less, highly abstract rules that jointly generate all and only the observed phenomena and are common to all human languages. We review developmental, behavioral, and computational evidence that seems to favor an alternative view of language, according to which linguistic structures are generated by a large, open set of constructions of varying degrees of abstraction and complexity, which embody both form and meaning and are acquired through socially situated experience in a given language community, by probabilistic learning algorithms that resemble those at work in other cognitive modalities.
An algorithmic approach to solving polynomial equations associated with quantum circuits
NASA Astrophysics Data System (ADS)
Gerdt, V. P.; Zinin, M. V.
2009-12-01
In this paper we present two algorithms for reducing systems of multivariate polynomial equations over the finite field F 2 to the canonical triangular form called a lexicographical Gröbner basis. This triangular form is the most appropriate for finding solutions of the system. On the other hand, a system of polynomials over F 2 whose variables also take values in F 2 (Boolean polynomials) completely describes the unitary matrix generated by a quantum circuit. In particular, the matrix itself can be computed by counting the number of solutions (roots) of the associated polynomial system. Thereby, efficient construction of the lexicographical Gröbner bases over F 2 associated with quantum circuits gives a method for computing their circuit matrices that is an alternative to the direct numerical method based on linear algebra. We compare our implementation of both algorithms with some other software packages available for computing Gröbner bases over F 2.
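The root-counting step can be illustrated by brute force, i.e., exactly the exponential enumeration over all F 2 assignments that Gröbner-basis methods are designed to avoid; the example polynomial system below is ours, not one from the paper:

```python
from itertools import product

# Brute-force count of the common roots of a Boolean polynomial system
# over F2 (addition is XOR). The root count is the quantity the paper
# extracts efficiently via lexicographical Groebner bases; enumerating
# all 2^n assignments, as here, only works for tiny systems.

def count_roots(polys, n_vars):
    """polys: callables F2^n -> {0, 1}; returns number of common roots."""
    return sum(
        all(p(*x) % 2 == 0 for p in polys)
        for x in product((0, 1), repeat=n_vars)
    )

# Example system: x*y + z = 0  and  x + y = 0  over F2
p1 = lambda x, y, z: (x * y + z) % 2
p2 = lambda x, y, z: (x + y) % 2
print(count_roots([p1, p2], 3))  # roots: (0,0,0) and (1,1,1), so 2
```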
Topology and boundary shape optimization as an integrated design tool
NASA Technical Reports Server (NTRS)
Bendsoe, Martin Philip; Rodrigues, Helder Carrico
1990-01-01
The optimal topology of a two dimensional linear elastic body can be computed by regarding the body as a domain of the plane with a high density of material. Such an optimal topology can then be used as the basis for a shape optimization method that computes the optimal form of the boundary curves of the body. This results in an efficient and reliable design tool, which can be implemented via a common FEM mesh generator and CAD-type input-output facilities.
Sound Emission of Rotor Induced Deformations of Generator Casings
NASA Technical Reports Server (NTRS)
Polifke, W.; Mueller, B.; Yee, H. C.; Mansour, Nagi (Technical Monitor)
2001-01-01
The casing of large electrical generators can be deformed slightly by the rotor's magnetic field. The sound emission produced by these periodic deformations, which could possibly exceed guaranteed noise emission limits, is analysed analytically and numerically. From the deformation of the casing, the normal velocity of the generator's surface is computed. Taking into account the corresponding symmetry, an analytical solution for the acoustic pressure outside the generator is found in terms of the Hankel function of second order. The normal velocity of the generator surface provides the required boundary condition for the acoustic pressure and determines the magnitude of the pressure oscillations. For the numerical simulation, the nonlinear 2D Euler equations are formulated in a perturbation form for low Mach number Computational Aeroacoustics (CAA). The spatial derivatives are discretized by the classical sixth-order central interior scheme and a third-order boundary scheme. Spurious high-frequency oscillations are damped by a characteristic-based artificial compression method (ACM) filter. The time derivatives are approximated by the classical 4th-order Runge-Kutta method. The numerical results are in excellent agreement with the analytical solution.
Braun, Doris E; Gelbrich, Thomas; Wurst, Klaus; Griesser, Ulrich J
2016-06-01
New polymorphs of thymine emerged in an experimental search for solid forms, which was guided by the computationally generated crystal energy landscape. Three of the four anhydrates (AH) are homeoenergetic (A°–C) and their packing modes differ only in the location of oxygen and hydrogen atoms. AHs A° and B are ordered phases, whereas AH C shows disorder (X-ray diffuse scattering). Analysis of the crystal energy landscape for alternative AH C hydrogen-bonded ribbon motifs identified a number of different packing modes, whose 3D structures were calculated to deviate by less than 0.24 kJ mol−1 in lattice energy. These structures provide models for stacking faults. The three anhydrates A°–C show strong similarity in their powder X-ray diffraction, thermoanalytical and spectroscopic (IR and Raman) characteristics. The already known anhydrate AH A° was identified as the thermodynamically most stable form at ambient conditions; AH B and AH C are metastable but show high kinetic stability. The hydrate of thymine is stable only at water activities (aw) > 0.95 at temperatures ≤ 25 °C. It was found to be a stoichiometric hydrate despite being a channel hydrate, with an unusual water:thymine ratio of 0.8:1. Depending on the dehydration conditions, either AH C or AH D is obtained. The hydrate is the only known precursor to AH D. This study highlights the value and complementarity of simultaneous explorations of computationally and experimentally generated solid form landscapes of a small molecule anhydrate ↔ hydrate system. PMID:28663717
BIOCOMPUTATION: some history and prospects.
Cull, Paul
2013-06-01
At first glance, biology and computer science are diametrically opposed sciences. Biology deals with carbon based life forms shaped by evolution and natural selection. Computer Science deals with electronic machines designed by engineers and guided by mathematical algorithms. In this brief paper, we review biologically inspired computing. We discuss several models of computation which have arisen from various biological studies. We show what these have in common, and conjecture how biology can still suggest answers and models for the next generation of computing problems. We discuss computation and argue that these biologically inspired models do not extend the theoretical limits on computation. We suggest that, in practice, biological models may give more succinct representations of various problems, and we mention a few cases in which biological models have proved useful. We also discuss the reciprocal impact of computer science on biology and cite a few significant contributions to biological science. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Managing Data From Signal-Propagation Experiments
NASA Technical Reports Server (NTRS)
Kantak, A. V.
1989-01-01
Computer programs generate characteristic plots from amplitudes and phases. Software system enables minicomputer to process data on amplitudes and phases of signals received during experiments in ground-mobile/satellite radio propagation. Takes advantage of file-handling capabilities of UNIX operating system and C programming language. Interacts with user, under whose guidance programs in FORTRAN language generate plots of spectra or other curves of types commonly used to characterize signals. FORTRAN programs used to process file-handling outputs into any of several useful forms.
Toutounji, Hazem; Pipa, Gordon
2014-01-01
It is a long-established fact that neuronal plasticity occupies the central role in generating neural function and computation. Nevertheless, no unifying account exists of how neurons in a recurrent cortical network learn to compute on temporally and spatially extended stimuli. However, these stimuli constitute the norm, rather than the exception, of the brain's input. Here, we introduce a geometric theory of learning spatiotemporal computations through neuronal plasticity. To that end, we rigorously formulate the problem of neural representations as a relation in space between stimulus-induced neural activity and the asymptotic dynamics of excitable cortical networks. Backed up by computer simulations and numerical analysis, we show that two canonical and widely spread forms of neuronal plasticity, that is, spike-timing-dependent synaptic plasticity and intrinsic plasticity, are both necessary for creating neural representations, such that these computations become realizable. Interestingly, the effects of these forms of plasticity on the emerging neural code relate to properties necessary for both combating and utilizing noise. The neural dynamics also exhibits features of the most likely stimulus in the network's spontaneous activity. These properties of the spatiotemporal neural code resulting from plasticity, having their grounding in nature, further consolidate the biological relevance of our findings. PMID:24651447
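Of the two plasticity forms the abstract names, pairwise spike-timing-dependent plasticity (STDP) has a standard textbook form that is easy to sketch; the time constants and amplitudes below are conventional illustrative values, not the paper's parameters:

```python
import math

# Minimal sketch of pairwise spike-timing-dependent plasticity (STDP):
# a synapse is potentiated when the presynaptic spike precedes the
# postsynaptic one and depressed otherwise, with exponentially
# decaying magnitude in the spike-time difference.

def stdp_dw(t_pre, t_post, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Weight change for a single pre/post spike pair (times in ms)."""
    dt = t_post - t_pre
    if dt > 0:    # pre before post: potentiation
        return a_plus * math.exp(-dt / tau)
    elif dt < 0:  # post before pre: depression
        return -a_minus * math.exp(dt / tau)
    return 0.0

print(stdp_dw(10.0, 15.0) > 0)   # causal pair strengthens the synapse
print(stdp_dw(15.0, 10.0) < 0)   # anti-causal pair weakens it
```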
Numerical Simulation Of Cutting Of Gear Teeth
NASA Technical Reports Server (NTRS)
Oswald, Fred B.; Huston, Ronald L.; Mavriplis, Dimitrios
1994-01-01
Shapes of gear teeth produced by gear cutters of specified shape simulated computationally, according to approach based on principles of differential geometry. Results of computer simulation displayed as computer graphics and/or used in analyses of design, manufacturing, and performance of gears. Applicable to both standard and non-standard gear-tooth forms. Accelerates and facilitates analysis of alternative designs of gears and cutters. Simulation extended to study generation of surfaces other than gears. Applied to cams, bearings, and surfaces of arbitrary rolling elements as well as to gears. Possible to develop analogous procedures for simulating manufacture of skin surfaces like automobile fenders, airfoils, and ship hulls.
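The flanks of standard gear teeth follow the involute of the base circle, the kind of differential-geometric primitive such a cutting simulation manipulates; this parametric sketch is generic textbook geometry, not the NASA code:

```python
import math

# Parametric involute of a circle: the standard tooth-flank curve for
# gears. The point starts on the base circle (t = 0) and spirals
# outward as the parameter t (unwrapped angle, radians) grows.

def involute_point(r_base, t):
    """Point (x, y) on the involute of a circle of radius r_base."""
    x = r_base * (math.cos(t) + t * math.sin(t))
    y = r_base * (math.sin(t) - t * math.cos(t))
    return x, y

r = 10.0
for t in (0.0, 0.3, 0.6):
    x, y = involute_point(r, t)
    print(round(math.hypot(x, y), 3))  # radius grows monotonically with t
```

A useful check on the formula: the distance from the center is r_base * sqrt(1 + t^2), so the curve indeed begins on the base circle.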
Esthetic Education of Children with Special Needs by Means of Computer Art
ERIC Educational Resources Information Center
Salakhov, Rasykh F.; Salakhova, Rada I.; Nasibullov, Ramis R.
2016-01-01
Thematic justification of the problem under study: growth of social stratification in the present context of social and economic transformations and emerging forms of social inequality, which is creating serious obstacles for the younger generation's social adaptation, specifically among those social groups that are dealing with difficulties due…
Automated Formative Assessment as a Tool to Scaffold Student Documentary Writing
ERIC Educational Resources Information Center
Ferster, Bill; Hammond, Thomas C.; Alexander, R. Curby; Lyman, Hunt
2012-01-01
The hurried pace of the modern classroom does not permit formative feedback on writing assignments at the frequency or quality recommended by the research literature. One solution for increasing individual feedback to students is to incorporate some form of computer-generated assessment. This study explores the use of automated assessment of…
Practical Downloading to Desktop Publishing: Enhancing the Delivery of Information.
ERIC Educational Resources Information Center
Danziger, Pamela N.
This paper is addressed to librarians and information managers who, as one of the many activities they routinely perform, frequently publish information in such formats as newsletters, manuals, brochures, forms, presentations, or reports. It is argued that desktop publishing--a personal computer-based software package used to generate documents of…
Night Attack Workload Steering Group. Volume 3. Simulation and Human Factors Subgroup
1982-06-01
information interpretation. The second is the use of pictorial formats or computer generated displays that combine many present-day displays into a small number...base exists in any form (digital, film, or model) which supports the wide area, long track, low level requirements levied by night attack training
Computer-generated holograms and diffraction gratings in optical security applications
NASA Astrophysics Data System (ADS)
Stepien, Pawel J.
2000-04-01
The term 'computer generated hologram' (CGH) describes a diffractive structure strictly calculated and recorded to diffract light in a desired way. The CGH surface profile is a result of the wavefront calculation rather than of interference. CGHs are able to form 2D and 3D images. Optically variable devices (OVDs) composed of diffractive gratings are often used in security applications. There are various types of optically and digitally recorded gratings in security applications. Grating-based OVDs are used to record bright 2D images with a limited range of cinematic effects. These effects result from various orientations or densities of the recorded gratings. It is difficult to record high quality OVDs of 3D objects using gratings. Stereograms and analogue rainbow holograms offer 3D imaging, but they are darker and have lower resolution than grating OVDs. CGH-based OVDs contain an unlimited range of cinematic effects and high quality 3D images. Images recorded using CGHs are usually noisier than grating-based OVDs because of numerical inaccuracies in CGH calculation and mastering. CGH-based OVDs enable smooth integration of hidden and machine-readable features within an OVD design.
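As one concrete example of a diffractive profile that is calculated rather than recorded interferometrically, here is a sketch of a staircase-quantized Fresnel lens phase; the wavelength, focal length, and 4-level quantization are illustrative choices, and security CGHs encode far more complex wavefronts than this:

```python
import math

# Sketch of the "calculated, not interfered" idea behind CGHs: the
# quadratic phase profile of a Fresnel lens, quantized to a staircase
# of discrete levels as a CGH mastering step might do.

def fresnel_phase(r, wavelength, focal_length, levels=4):
    """Staircase-quantized lens phase level (0..levels-1) at radius r."""
    phase = (math.pi * r * r / (wavelength * focal_length)) % (2 * math.pi)
    return int(phase / (2 * math.pi) * levels)

wl, f = 0.633e-6, 0.1   # HeNe wavelength (m), 100 mm focal length
profile = [fresnel_phase(r * 1e-5, wl, f) for r in range(20)]
print(profile)  # phase levels across the first 0.19 mm of radius
```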
Improved Measurement of Ejection Velocities From Craters Formed in Sand
NASA Technical Reports Server (NTRS)
Cintala, Mark J.; Byers, Terry; Cardenas, Francisco; Montes, Roland; Potter, Elliot E.
2014-01-01
A typical impact crater is formed by two major processes: compression of the target (essentially equivalent to a footprint in soil) and ejection of material. The Ejection-Velocity Measurement System (EVMS) in the Experimental Impact Laboratory has been used to study ejection velocities from impact craters formed in sand since the late 1990s. The original system used an early-generation Charge-Coupled Device (CCD) camera; custom-written software; and a complex, multicomponent optical system to direct laser light for illumination. Unfortunately, the electronic equipment was overtaken by age, and the software became obsolete in light of improved computer hardware.
Linking entanglement and discrete anomaly
NASA Astrophysics Data System (ADS)
Hung, Ling-Yan; Wu, Yong-Shi; Zhou, Yang
2018-05-01
In 3d Chern-Simons theory, there is a discrete one-form symmetry whose symmetry group is isomorphic to the center of the gauge group. We study the 't Hooft anomaly associated to this discrete one-form symmetry in theories with generic gauge groups of A, B, C, D type. We propose to detect the discrete anomaly by computing the Hopf state entanglement in the subspace spanned by the symmetry generators and develop a systematic way to do so based on the truncated modular S matrix. We check our proposal for many examples.
Floating-point function generation routines for 16-bit microcomputers
NASA Technical Reports Server (NTRS)
Mackin, M. A.; Soeder, J. F.
1984-01-01
Several computer subroutines have been developed that interpolate three types of nonanalytic functions: univariate, bivariate, and map. The routines use data in floating-point form. However, because they are written for use on a 16-bit Intel 8086 system with an 8087 mathematical coprocessor, they execute as fast as routines using data in scaled integer form. Although all of the routines are written in assembly language, they have been implemented in a modular fashion so as to facilitate their use with high-level languages.
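A minimal sketch of what such a univariate function-generation routine does (breakpoint table plus piecewise-linear interpolation, in floating point); the clamping behavior and table values are assumptions for illustration, not the actual conventions of the NASA routines:

```python
import bisect

# Univariate table-lookup "function generation": piecewise-linear
# interpolation of y(x) in a sorted breakpoint table, the floating-point
# analogue of the scaled-integer routines the report compares against.

def univariate(xs, ys, x):
    """Interpolate y(x) from sorted breakpoints xs with values ys;
    clamps to the end values outside the table range."""
    if x <= xs[0]:
        return ys[0]
    if x >= xs[-1]:
        return ys[-1]
    i = bisect.bisect_right(xs, x) - 1
    frac = (x - xs[i]) / (xs[i + 1] - xs[i])
    return ys[i] + frac * (ys[i + 1] - ys[i])

xs = [0.0, 1.0, 2.0, 4.0]
ys = [0.0, 10.0, 15.0, 15.0]
print(univariate(xs, ys, 1.5))   # 12.5
print(univariate(xs, ys, -3.0))  # clamped to 0.0
```

A bivariate (map) routine extends the same idea to interpolation in two table dimensions.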
NASA Astrophysics Data System (ADS)
Oda, Akifumi; Fukuyoshi, Shuichi
2015-06-01
The GADV hypothesis is a form of the protein world hypothesis, which suggests that life originated from proteins (Lacey et al. 1999; Ikehara 2002; Andras 2006). In the GADV hypothesis, life is thought to have originated from primitive proteins constructed of only glycine, alanine, aspartic acid, and valine ([GADV]-proteins). In this study, the three-dimensional (3D) conformations of randomly generated short [GADV]-peptides were computationally investigated using replica-exchange molecular dynamics (REMD) simulations (Sugita and Okamoto 1999). Because the peptides used in this study consisted of only 20 residues each, they could not form certain 3D structures. However, the conformational tendencies of the peptides were elucidated by analyzing the conformational ensembles generated by REMD simulations. The results indicate that secondary structures can be formed in several randomly generated [GADV]-peptides. A long helical structure was found in one of the hydrophobic peptides, supporting the conjecture of the GADV hypothesis that many peptides aggregated to form peptide multimers with enzymatic activity in the primordial soup. In addition, these results indicate that REMD simulations can be used for the structural investigation of short peptides.
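The peptide-generation step (though certainly not the REMD simulation itself) is easy to sketch; the hydrophobicity proxy below is our own simplification for illustration, not a measure from the study:

```python
import random

# Sketch of generating random 20-residue [GADV]-peptides, the inputs
# whose conformational tendencies the study examined via REMD.

GADV = "GADV"  # glycine, alanine, aspartic acid, valine (one-letter codes)

def random_gadv_peptide(length=20, rng=None):
    rng = rng or random.Random()
    return "".join(rng.choice(GADV) for _ in range(length))

def hydrophobic_fraction(seq):
    """Fraction of A/V residues: a crude hydrophobicity proxy."""
    return sum(seq.count(r) for r in "AV") / len(seq)

pep = random_gadv_peptide(rng=random.Random(0))  # seeded for repeatability
print(pep, round(hydrophobic_fraction(pep), 2))
```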
A climate responsive urban design tool: a platform to improve energy efficiency in a dry hot climate
NASA Astrophysics Data System (ADS)
El Dallal, Norhan; Visser, Florentine
2017-09-01
In the Middle East and North Africa (MENA) region, new urban developments should address the climatic conditions to improve outdoor comfort and to reduce the energy consumption of buildings. This article describes a design tool that supports climate responsive design for a dry hot climate. The approach takes the climate as an initiator for the conceptual urban form with a more energy-efficient urban morphology. The methodology relates the different passive strategies suitable for major climate conditions in MENA region (dry-hot) to design parameters that create the urban form. This parametric design approach is the basis for a tool that generates conceptual climate responsive urban forms so as to assist the urban designer early in the design process. Various conceptual scenarios, generated by a computational model, are the results of the proposed platform. A practical application of the approach is conducted on a New Urban Community in Aswan (Egypt), showing the economic feasibility of the resulting urban form and morphology, and the proposed tool.
NASA Astrophysics Data System (ADS)
Ehrentreich, F.; Dietze, U.; Meyer, U.; Abbas, S.; Schulz, H.
1995-04-01
A main task within the SpecInfo project is to develop interpretation tools that can handle many more of the complicated, more specific spectrum-structure correlations. In the first step, the empirical knowledge about the assignment of structural groups and their characteristic IR bands was collected from the literature and represented in a computer-readable, well-structured form. Vague, verbal rules are managed by the introduction of linguistic variables. The next step was the development of automatic rule-generating procedures. We combined and extended the IDIOTS algorithm with the algorithm by Blaffert relying on set theory. The procedures were successfully applied to the SpecInfo database. The realization of the preceding items is a prerequisite for the improvement of the computerized structure elucidation procedure.
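The band-assignment rule table described here can be caricatured as a lookup of observed wavenumbers against characteristic ranges; the ranges and labels below are textbook-style illustrations, not entries from SpecInfo's actual rule base:

```python
# Toy characteristic-band rule table: observed IR band positions
# (cm^-1) are matched against illustrative wavenumber ranges for
# structural-group assignment.

RULES = [
    ((1650, 1780), "C=O stretch (carbonyl)"),
    ((3200, 3550), "O-H stretch (hydroxyl, broad)"),
    ((2100, 2260), "C#C / C#N stretch"),
]

def assign(bands):
    """Return (band, assignment) pairs for bands inside any rule range."""
    hits = []
    for band in bands:
        for (lo, hi), label in RULES:
            if lo <= band <= hi:
                hits.append((band, label))
    return hits

print(assign([1715, 2950, 3350]))  # 2950 matches no rule in this toy table
```

Linguistic variables, as mentioned in the abstract, would replace the hard range boundaries with fuzzy membership functions.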
Neural classification of the selected family of butterflies
NASA Astrophysics Data System (ADS)
Zaborowicz, M.; Boniecki, P.; Piekarska-Boniecka, H.; Koszela, K.; Mueller, W.; Górna, K.; Okoń, P.
2017-07-01
There is growing interest among researchers in drawing conclusions from information coded in graphic form. The neuronal identification of pictorial data, with special emphasis on both quantitative and qualitative analysis, is increasingly used to gain and deepen empirical knowledge of the data. Extraction and subsequent classification of selected picture features, such as color or surface structure, make it possible to create computer tools that identify objects presented as, for example, digital pictures. The work presents the original computer system "Processing the image v.1.0", designed to digitalize pictures on the basis of a color criterion. The system has been applied to generate a reference learning file for training an Artificial Neural Network (ANN) to identify selected kinds of butterflies from the Papilionidae family.
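The color-based feature extraction step can be sketched as a coarse RGB histogram; the binning scheme and the two-pixel stand-in "image" below are illustrative, not the actual "Processing the image v.1.0" pipeline:

```python
# Sketch of color feature extraction: an image (here just a list of
# RGB pixels) is reduced to a coarse, normalized color histogram --
# the kind of fixed-length vector a neural classifier is trained on.

def color_histogram(pixels, bins_per_channel=2):
    """Return a normalized histogram with bins_per_channel**3 bins."""
    n_bins = bins_per_channel ** 3
    hist = [0] * n_bins
    step = 256 // bins_per_channel
    for r, g, b in pixels:
        idx = ((r // step) * bins_per_channel + (g // step)) \
              * bins_per_channel + (b // step)
        hist[idx] += 1
    total = len(pixels)
    return [h / total for h in hist]

# two "wing" pixels: bright yellow and near-black
print(color_histogram([(250, 240, 10), (5, 5, 5)]))
```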
The Science of Computing: Virtual Memory
NASA Technical Reports Server (NTRS)
Denning, Peter J.
1986-01-01
In the March-April issue, I described how a computer's storage system is organized as a hierarchy consisting of cache, main memory, and secondary memory (e.g., disk). The cache and main memory form a subsystem that functions like main memory but attains speeds approaching cache. What happens if a program and its data are too large for the main memory? This is not a frivolous question. Every generation of computer users has been frustrated by insufficient memory. A new line of computers may have sufficient storage for the computations of its predecessor, but new programs will soon exhaust its capacity. In 1960, a long-range planning committee at MIT dared to dream of a computer with 1 million words of main memory. In 1985, the Cray-2 was delivered with 256 million words. Computational physicists dream of computers with 1 billion words. Computer architects have done an outstanding job of enlarging main memories, yet they have never kept up with demand. Only the shortsighted believe they can.
Simple and powerful visual stimulus generator.
Kremlácek, J; Kuba, M; Kubová, Z; Vít, F
1999-02-01
We describe a cheap, simple, portable and efficient approach to visual stimulation for neurophysiology which does not need any special hardware equipment. The method, based on an animation technique, uses the Autodesk Animator FLI format. The animation is replayed by a special program (the 'player'), which provides synchronisation pulses to the recording system via the parallel port. The player runs on an IBM-compatible personal computer under the MS-DOS operating system, and the stimulus is displayed on a VGA computer monitor. Various stimuli created with this technique for visual evoked potentials (VEPs) are presented.
Computing 3-D steady supersonic flow via a new Lagrangian approach
NASA Technical Reports Server (NTRS)
Loh, C. Y.; Liou, M.-S.
1993-01-01
The new Lagrangian method introduced by Loh and Hui (1990) is extended for 3-D steady supersonic flow computation. Details of the conservation form, the implementation of the local Riemann solver, and the Godunov and the high resolution TVD schemes are presented. The new approach is robust yet accurate, capable of handling complicated geometry and interactions between discontinuous waves. It keeps all the advantages claimed in the 2-D method of Loh and Hui, e.g., crisp resolution for a slip surface (contact discontinuity) and automatic grid generation along the stream.
Moments of inclination error distribution computer program
NASA Technical Reports Server (NTRS)
Myler, T. R.
1981-01-01
A FORTRAN coded computer program is described which calculates orbital inclination error statistics using a closed-form solution. This solution uses a data base of trajectory errors from actual flights to predict the orbital inclination error statistics. The Scott flight history data base consists of orbit insertion errors in the trajectory parameters - altitude, velocity, flight path angle, flight azimuth, latitude and longitude. The methods used to generate the error statistics are of general interest since they have other applications. Program theory, user instructions, output definitions, subroutine descriptions and detailed FORTRAN coding information are included.
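At their simplest, the statistics derived from the flight-history data base are sample moments of the recorded insertion errors. The original program is FORTRAN and uses a closed-form inclination solution; the fragment below only illustrates the moment computation:

```python
import math

def moments(errors):
    """First two moments (mean, sample standard deviation) of an error sample."""
    n = len(errors)
    mean = sum(errors) / n
    var = sum((e - mean) ** 2 for e in errors) / (n - 1)
    return mean, math.sqrt(var)
```

The same routine would be applied per trajectory parameter (altitude, velocity, flight path angle, and so on) before mapping the errors into inclination space.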
Continuous Variable Cluster State Generation over the Optical Spatial Mode Comb
Pooser, Raphael C.; Jing, Jietai
2014-10-20
One-way quantum computing uses single-qubit projective measurements performed on a cluster state (a highly entangled state of multiple qubits) in order to enact quantum gates. The model is promising due to its potential scalability; the cluster state may be produced at the beginning of the computation and operated on over time. Continuous variables (CV) offer another potential benefit in the form of deterministic entanglement generation. This determinism can lead to robust cluster states and scalable quantum computation. Recent demonstrations of CV cluster states have made great strides on the path to scalability utilizing either time or frequency multiplexing in optical parametric oscillators (OPO) both above and below threshold. The techniques relied on a combination of entangling operators and beam splitter transformations. Here we show that an analogous transformation exists for amplifiers with Gaussian input states operating on multiple spatial modes. By judicious selection of local oscillators (LOs), the spatial mode distribution is analogous to the optical frequency comb consisting of axial modes in an OPO cavity. We outline an experimental system that generates cluster states across the spatial frequency comb and can also scale the amount of quantum noise reduction to potentially larger than in other systems.
NASA Technical Reports Server (NTRS)
Stack, S. H.
1981-01-01
A computer-aided design system has recently been developed specifically for the small research group environment. The system is implemented on a Prime 400 minicomputer linked with a CDC 6600 computer. The goal was to assign the minicomputer specific tasks, such as data input and graphics, thereby reserving the large mainframe computer for time-consuming analysis codes. The basic structure of the design system consists of GEMPAK, a computer code that generates detailed configuration geometry from a minimum of input; interface programs that reformat GEMPAK geometry for input to the analysis codes; and utility programs that simplify computer access and data interpretation. The working system has had a large positive impact on the quantity and quality of research performed by the originating group. This paper describes the system, the major factors that contributed to its particular form, and presents examples of its application.
GENIE(++): A Multi-Block Structured Grid System
NASA Technical Reports Server (NTRS)
Williams, Tonya; Nadenthiran, Naren; Thornburg, Hugh; Soni, Bharat K.
1996-01-01
The computer code GENIE++ is a continuously evolving grid system containing a multitude of proven geometry/grid techniques. The generation process in GENIE++ is based on an earlier version of the code. The process uses several techniques, either separately or in combination, to quickly and economically generate sculptured geometry descriptions and grids for arbitrary geometries. The computational mesh is formed using an appropriate algebraic method. Grid clustering is accomplished with either exponential or hyperbolic tangent routines which allow the user to specify a desired point distribution. Grid smoothing can be accomplished by using an elliptic solver with proper forcing functions. B-spline and Non-Uniform Rational B-Spline (NURBS) algorithms are used for surface definition and redistribution. The built-in sculptured geometry definition with desired distribution of points, automatic Bezier curve/surface generation for interior boundaries/surfaces, and surface redistribution are based on NURBS. Weighted Lagrange/Hermite transfinite interpolation methods, interactive geometry/grid manipulation modules, and on-line graphical visualization of the generation process are salient features of this system, which result in significant time savings for a given geometry/grid application.
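A hyperbolic tangent clustering routine of the kind mentioned above can be sketched as a one-sided stretching function. The form below is a common textbook variant, not necessarily the one implemented in GENIE++:

```python
import math

def tanh_cluster(n, beta=2.0):
    """n+1 points on [0, 1]; larger beta clusters them more tightly near s = 0."""
    pts = []
    for i in range(n + 1):
        eta = i / n                                   # uniform parameter
        s = 1.0 + math.tanh(beta * (eta - 1.0)) / math.tanh(beta)
        pts.append(s)
    return pts
```

The user-facing control is the stretching parameter `beta`, which trades near-wall resolution against far-field spacing.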
Organizer and axes formation as a self-organizing process.
Meinhardt, H
2001-01-01
It is a widely held view that axis formation is based essentially on pre-localized determinants. However, the robustness of early development, the pattern regulation observed after experimental interferences and the existence of systems that don't require maternal determinants suggest that self-regulating pattern forming systems are also involved. A model is proposed that allows axes formation by a chain of reactions based on local self-enhancement and long-range inhibition. Their appropriate linkage ensures that the intermediary patterns emerge in the correct sequence and have the correct spatial relation to each other. Specifically, the model comprises the following events: the generation of a pole by a pattern-forming process, the formation of a second organizer eccentric to the pole (e.g. the Nieuwkoop center), the ecto-meso-endo subdivision, the generation of the Spemann-Mangold organizer with its anterior-posterior subdivision under the influence of the Nieuwkoop center, the conversion of the Spemann-Mangold organizer (a hot spot) into the notochord (a hot stripe), and the marking of the left side of the organism by a patterning reaction influenced by the midline. The pattern forming reactions do not depend on but can make use of maternally pre-localized determinants or asymmetries. Comparison with known genes and molecules reveals that many of the expected ingredients are present. Computer simulations show that the model accounts for many regulatory features reported in the literature. The computer simulations are available in an animated form at.
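The "local self-enhancement and long-range inhibition" mechanism is the classic Gierer-Meinhardt activator-inhibitor system. A minimal 1D forward-Euler sketch with illustrative parameters (not the paper's actual simulation code):

```python
import random

def gierer_meinhardt(n=40, steps=500, dt=0.01):
    """1D activator-inhibitor dynamics on a periodic domain.

    a: activator (local self-enhancement via a**2/h)
    h: inhibitor (produced by a**2, diffusing much faster: long-range inhibition)
    """
    random.seed(1)
    a = [1.0 + 0.01 * random.random() for _ in range(n)]   # noisy start
    h = [1.0] * n
    da, dh = 0.005, 0.2            # inhibitor diffuses ~40x faster
    rho, mu, nu = 1.0, 1.0, 1.0    # production/decay rates (illustrative)
    for _ in range(steps):
        lap = lambda f, i: f[(i - 1) % n] - 2 * f[i] + f[(i + 1) % n]
        a_new = [a[i] + dt * (rho * a[i] ** 2 / h[i] - mu * a[i] + da * lap(a, i))
                 for i in range(n)]
        h_new = [h[i] + dt * (rho * a[i] ** 2 - nu * h[i] + dh * lap(h, i))
                 for i in range(n)]
        a, h = a_new, h_new
    return a, h
```

With suitable parameters and longer runs, such systems amplify small asymmetries into a single activator peak, the "pole" of the model.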
A Framework to Learn Physics from Atomically Resolved Images
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vlcek, L.; Maksov, A.; Pan, M.
Here, we present a generalized framework for extracting physics, i.e., knowledge, from atomically resolved images, and show its utility by applying it to a model system of chalcogen-atom segregation in an FeSe0.45Te0.55 superconductor system. We emphasize that the framework can be used for any imaging data for which a generative physical model exists. Consider that a generative physical model can produce a very large number of configurations, not all of which are observable. By applying a microscope function to a subset of this generated data, we form a simulated dataset on which statistics can be computed.
Development of Three-Dimensional Object Completion in Infancy
ERIC Educational Resources Information Center
Soska, Kasey C.; Johnson, Scott P.
2008-01-01
Three-dimensional (3D) object completion was investigated by habituating 4- and 6-month-old infants (n = 24 total) with a computer-generated wedge stimulus that pivoted 15°, providing only a limited view. Two displays, rotating 360°, were then shown: a complete, solid volume and an incomplete, hollow form composed only of the sides…
75 FR 38565 - Proposed collection; comment request
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-02
... provided claim form that he or she did not work on any day claimed and did not receive income such as.... Further, under 20 CFR 322.4(b), when there is a question raised as to whether or not remuneration is... benefits from a railroad employer. The request is generated as a result of a computer match that compares...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-26
... Federal Register at 73 FR 51276 on September 2, 2008, by the Department of Commerce. This FIPS requirement was withdrawn by the Secretary of Commerce because it was obsolete and had not been updated to adopt...) 12866 and 13563 direct agencies to assess all costs and benefits of available regulatory alternatives...
Using the Item Response Theory (IRT) for Educational Evaluation through Games
ERIC Educational Resources Information Center
Euzébio Batista, Marcelo Henrique; Victória Barbosa, Jorge Luis; da Rosa Tavares, João Elison; Hackenhaar, Jonathan Luis
2013-01-01
This article shows the application of Item Response Theory (IRT) for educational evaluation using games. The article proposes a computational model to create user profiles, called Psychometric Profile Generator (PPG). PPG uses the IRT mathematical model for exploring the levels of skills and behaviors in the form of items and/or stimuli. The model…
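The IRT mathematical model referenced above is typically a logistic item response function. A sketch of the two-parameter form (the article does not state which IRT variant PPG uses, so the parameterization here is an assumption):

```python
import math

def p_correct(theta, a, b):
    """2PL IRT: probability of a correct response for ability theta,
    item discrimination a, and item difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))
```

When ability equals item difficulty, the model predicts a 50% chance of success, which is the property estimation procedures exploit to place players and items on a common scale.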
Impact of iPads on Break-Time in Primary Schools--A Danish Context
ERIC Educational Resources Information Center
Schilhab, Theresa
2017-01-01
Today, technology in the form of tablet computers (e.g. iPads) is crucial as a tool for learning and education. Tablets support educational activities such as archiving, word processing, and generation of academic products. They also connect with the Internet, providing access to news, encyclopaedic entries, and e-books. In addition, tablets have…
A Framework for Representing and Jointly Reasoning over Linguistic and Non-Linguistic Knowledge
ERIC Educational Resources Information Center
Murugesan, Arthi
2009-01-01
Natural language poses several challenges to developing computational systems for modeling it. Natural language is not a precise problem but is rather ridden with a number of uncertainties in the form of either alternate words or interpretations. Furthermore, natural language is a generative system where the problem size is potentially infinite.…
Johnson, Kevin B; Ravich, William J; Cowan, John A
2004-09-01
Computer-based software to record histories, physical exams, and progress or procedure notes, known as computer-based documentation (CBD) software, has been touted as an important addition to the electronic health record. The functionality of CBD systems has remained static over the past 30 years, which may have contributed to the limited adoption of these tools. Early users of this technology, who have tried multiple products, may have insight into important features to be considered in next-generation CBD systems. We conducted a cross-sectional, observational study of the clinical working group membership of the American Medical Informatics Association (AMIA) to generate a set of features that might improve adoption of next-generation systems. The study was conducted online over a 4-month period; 57% of the working group members completed the survey. As anticipated, CBD tool use was higher (53%) in this population than in US physician offices. The most common methods of data entry employed keyboard and mouse, with agreement that these modalities worked well. Many respondents had experience with pre-printed data collection forms before interacting with a CBD system. Respondents noted that CBD improved their ability to document large amounts of information, allowed timely sharing of information, enhanced patient care, and improved the sharing of medical information with other clinicians (all P < 0.001). Respondents also noted some important but absent features in CBD, including the ability to add images, get help, and generate billing information. The latest generation of CBD systems is being used successfully by early adopters, who find that these tools confer many advantages over the approaches to documentation that they replaced. These users provide insights that may improve successive generations of CBD tools.
Additional surveys of CBD non-users and failed adopters will be necessary to provide other useful insights that can address barriers to the adoption of CBD by less computer literate physicians.
De Novo Design and Experimental Characterization of Ultrashort Self-Associating Peptides
Xue, Bo; Robinson, Robert C.; Hauser, Charlotte A. E.; Floudas, Christodoulos A.
2014-01-01
Self-association is a common phenomenon in biology and one that can have positive and negative impacts, from the construction of the architectural cytoskeleton of cells to the formation of fibrils in amyloid diseases. Understanding the nature and mechanisms of self-association is important for modulating these systems and in creating biologically-inspired materials. Here, we present a two-stage de novo peptide design framework that can generate novel self-associating peptide systems. The first stage uses a simulated multimeric template structure as input into the optimization-based Sequence Selection to generate low potential energy sequences. The second stage is a computational validation procedure that calculates Fold Specificity and/or Approximate Association Affinity (K*association) based on metrics that we have devised for multimeric systems. This framework was applied to the design of self-associating tripeptides using the known self-associating tripeptide, Ac-IVD, as a structural template. Six computationally predicted tripeptides (Ac-LVE, Ac-YYD, Ac-LLE, Ac-YLD, Ac-MYD, Ac-VIE) were chosen for experimental validation in order to illustrate the self-association outcomes predicted by the three metrics. Self-association and electron microscopy studies revealed that Ac-LLE formed bead-like microstructures, Ac-LVE and Ac-YYD formed fibrillar aggregates, Ac-VIE and Ac-MYD formed hydrogels, and Ac-YLD crystallized under ambient conditions. An X-ray crystallographic study was carried out on a single crystal of Ac-YLD, which revealed that each molecule adopts a β-strand conformation that stack together to form parallel β-sheets. As an additional validation of the approach, the hydrogel-forming sequences of Ac-MYD and Ac-VIE were shuffled. The shuffled sequences were computationally predicted to have lower K*association values and were experimentally verified to not form hydrogels. 
This illustrates the robustness of the framework in predicting self-associating tripeptides. We expect that this enhanced multimeric de novo peptide design framework will find future application in creating novel self-associating peptides based on unnatural amino acids, and inhibitor peptides of detrimental self-aggregating biological proteins. PMID:25010703
Numerical Investigations of the Benchmark Supercritical Wing in Transonic Flow
NASA Technical Reports Server (NTRS)
Chwalowski, Pawel; Heeg, Jennifer; Biedron, Robert T.
2017-01-01
This paper builds on the computational aeroelastic results published previously and generated in support of the second Aeroelastic Prediction Workshop for the NASA Benchmark Supercritical Wing (BSCW) configuration. The computational results are obtained using FUN3D, an unstructured grid Reynolds-Averaged Navier-Stokes solver developed at the NASA Langley Research Center. The analysis results show the effects of the temporal and spatial resolution, the coupling scheme between the flow and the structural solvers, and the initial excitation conditions on the numerical flutter onset. Depending on the free stream condition and the angle of attack, the above parameters do affect the flutter onset. Two conditions are analyzed: Mach 0.74 at 0° angle of attack and Mach 0.85 at 5° angle of attack. The results are presented in the form of the damping values computed from the wing pitch angle response as a function of the dynamic pressure or in the form of dynamic pressure as a function of the Mach number.
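Damping values can be computed from a pitch-angle time history via the logarithmic decrement. A sketch under the assumption of a single decaying mode (FUN3D's actual post-processing may differ):

```python
import math

def damping_from_response(x0, x1, period):
    """Damping ratio from two response samples taken one damped period apart
    (logarithmic decrement method)."""
    delta = math.log(x0 / x1)
    return delta / math.sqrt(4 * math.pi ** 2 + delta ** 2)

# Synthetic decaying pitch response with a known damping ratio (assumed values)
zeta, wn = 0.05, 2 * math.pi
wd = wn * math.sqrt(1 - zeta ** 2)      # damped natural frequency
T = 2 * math.pi / wd                    # damped period
x = lambda t: math.exp(-zeta * wn * t) * math.cos(wd * t)
est = damping_from_response(x(0.0), x(T), T)
```

Negative estimated damping at a given dynamic pressure would indicate a growing oscillation, i.e., flutter onset.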
High-Resolution Large-Field-of-View Three-Dimensional Hologram Display System and Method Thereof
NASA Technical Reports Server (NTRS)
Chao, Tien-Hsin (Inventor); Mintz, Frederick W. (Inventor); Tsou, Peter (Inventor); Bryant, Nevin A. (Inventor)
2001-01-01
A real-time, dynamic, free-space, virtual-reality 3-D image display system is enabled by using a unique form of Aerogel as the primary display media. A preferred embodiment of this system comprises a 3-D mosaic topographic map which is displayed by fusing four projected hologram images. In this embodiment, four holographic images are projected from four separate holograms. Each holographic image subtends a quadrant of the 4π solid angle. By fusing these four holographic images, a static 3-D image such as a featured terrain map would be visible for 360° in the horizontal plane and 180° in the vertical plane. An input, either acquired by a 3-D image sensor or generated by computer animation, is first converted into a 2-D computer generated hologram (CGH). This CGH is then downloaded into a large liquid crystal (LC) panel. A laser projector illuminates the CGH-filled LC panel and generates and displays a real 3-D image in the Aerogel matrix.
Computational design of high efficiency release targets for use at ISOL facilities
NASA Astrophysics Data System (ADS)
Liu, Y.; Alton, G. D.; Middleton, J. W.
1999-06-01
This report describes efforts made at the Oak Ridge National Laboratory to design high-efficiency-release targets that simultaneously incorporate the short diffusion lengths, high permeabilities, controllable temperatures, and heat removal properties required for the generation of useful radioactive ion beam (RIB) intensities for nuclear physics and astrophysics research using the isotope separation on-line (ISOL) technique. Short diffusion lengths are achieved either by using thin fibrous target materials or by coating thin layers of selected target material onto low-density carbon fibers such as reticulated vitreous carbon fiber (RVCF) or carbon-bonded-carbon-fiber (CBCF) to form highly permeable composite target matrices. Computational studies which simulate the generation and removal of primary beam deposited heat from target materials have been conducted to optimize the design of target/heat-sink systems for generating RIBs. The results derived from diffusion release-rate simulation studies for selected targets and thermal analyses of temperature distributions within a prototype target/heat-sink system subjected to primary ion beam irradiation will be presented in this report.
High-efficiency-release targets for use at ISOL facilities: computational design
NASA Astrophysics Data System (ADS)
Liu, Y.; Alton, G. D.
1999-12-01
This report describes efforts made at the Oak Ridge National Laboratory to design high-efficiency-release targets that simultaneously incorporate the short diffusion lengths, high permeabilities, controllable temperatures, and heat-removal properties required for the generation of useful radioactive ion beam (RIB) intensities for nuclear physics and astrophysics research using the isotope separation on-line (ISOL) technique. Short diffusion lengths are achieved either by using thin fibrous target materials or by coating thin layers of selected target material onto low-density carbon fibers such as reticulated-vitreous-carbon fiber (RVCF) or carbon-bonded-carbon fiber (CBCF) to form highly permeable composite target matrices. Computational studies that simulate the generation and removal of primary beam deposited heat from target materials have been conducted to optimize the design of target/heat-sink systems for generating RIBs. The results derived from diffusion release-rate simulation studies for selected targets and thermal analyses of temperature distributions within a prototype target/heat-sink system subjected to primary ion beam irradiation are presented in this report.
Program for computer aided reliability estimation
NASA Technical Reports Server (NTRS)
Mathur, F. P. (Inventor)
1972-01-01
A computer program for estimating the reliability of self-repair and fault-tolerant systems with respect to selected system and mission parameters is presented. The computer program is capable of operation in an interactive conversational mode as well as in a batch mode and is characterized by maintenance of several general equations representative of basic redundancy schemes in an equation repository. Selected reliability functions applicable to any mathematical model formulated with the general equations, used singly or in combination with each other, are separately stored. One or more system and/or mission parameters may be designated as a variable. Data in the form of values for selected reliability functions is generated in a tabular or graphic format for each formulated model.
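A representative "general equation" for a basic redundancy scheme of the kind stored in such an equation repository is the closed-form reliability of triple modular redundancy. The function below is an illustrative example, not taken from the program:

```python
def tmr_reliability(r):
    """Reliability of triple modular redundancy with a perfect voter:
    the system survives if at least 2 of 3 modules (each reliability r) survive.

    R_tmr = r**3 + 3 * r**2 * (1 - r) = 3*r**2 - 2*r**3
    """
    return 3 * r ** 2 - 2 * r ** 3
```

Treating `r` as a function of a mission parameter, e.g. `r = exp(-lam * t)` for mission time `t`, yields the kind of tabular or graphic output the program generates when a parameter is designated as a variable.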
Use of symbolic computation in robotics education
NASA Technical Reports Server (NTRS)
Vira, Naren; Tunstel, Edward
1992-01-01
An application of symbolic computation in robotics education is described. A software package is presented which combines generality, user interaction, and user-friendliness with the systematic usage of symbolic computation and artificial intelligence techniques. The software utilizes MACSYMA, a LISP-based symbolic algebra language, to automatically generate closed-form expressions representing forward and inverse kinematics solutions, the Jacobian transformation matrices, robot pose error-compensation model equations, and the Lagrange dynamics formulation for N-degree-of-freedom, open-chain robotic manipulators. The goal of such a package is to aid faculty and students in the robotics course by removing the burdensome tasks of mathematical manipulation. The software package has been successfully tested for its accuracy using commercially available robots.
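The kind of closed-form kinematics expression such a package generates can be illustrated, for the simplest case of a planar two-link arm, by a numeric sketch (the package itself targets general N-degree-of-freedom manipulators, and this example is not its output):

```python
import math

def fk_2link(theta1, theta2, l1, l2):
    """Closed-form forward kinematics of a planar two-link arm:
    end-effector position for joint angles theta1, theta2 and link lengths l1, l2."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y
```

A symbolic system derives exactly this expression from the manipulator's kinematic parameters, sparing students the trigonometric bookkeeping for larger chains.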
Application of numerical grid generation for improved CFD analysis of multiphase screw machines
NASA Astrophysics Data System (ADS)
Rane, S.; Kovačević, A.
2017-08-01
Algebraic grid generation is widely used for discretization of the working domain of twin screw machines. Algebraic grid generation is fast and has good control over the placement of grid nodes. However, the grid quality needed to handle multiphase flows, such as oil injection, may at times be difficult to achieve. In order to obtain fast solution of multiphase screw machines, it is important to further improve the quality and robustness of the computational grid. In this paper, a deforming grid of a twin screw machine is generated using algebraic transfinite interpolation to produce an initial mesh, upon which an elliptic partial differential equation (PDE) of Poisson form is solved numerically to produce a smooth final computational mesh. The quality of numerical cells and their distribution obtained by the differential method is greatly improved. In addition, a similar procedure was introduced to fully smoothen the transition of the partitioning rack curve between the rotors, improving the continuous movement of grid nodes and in turn the robustness and speed of the Computational Fluid Dynamics (CFD) solver. Analysis of an oil-injected twin screw compressor is presented to compare the improvements in grid quality factors in the regions of importance such as the interlobe space, the radial tip and the core of the rotor. The proposed method that combines algebraic and differential grid generation offers a significant improvement in grid quality and robustness of the numerical solution.
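The two-stage process described above, an algebraic initial mesh improved by an elliptic solve, can be sketched with Laplace smoothing of interior nodes (i.e., the Poisson equation without its forcing/clustering terms, so this is a simplified stand-in):

```python
def smooth_grid(x, y, iters=200):
    """Jacobi iterations of Laplace smoothing on a structured grid.

    x, y are ni-by-nj nested lists of node coordinates; boundary nodes
    are held fixed, interior nodes relax toward the average of their
    four neighbours.
    """
    ni, nj = len(x), len(x[0])
    for _ in range(iters):
        xn = [row[:] for row in x]
        yn = [row[:] for row in y]
        for i in range(1, ni - 1):
            for j in range(1, nj - 1):
                xn[i][j] = 0.25 * (x[i-1][j] + x[i+1][j] + x[i][j-1] + x[i][j+1])
                yn[i][j] = 0.25 * (y[i-1][j] + y[i+1][j] + y[i][j-1] + y[i][j+1])
        x, y = xn, yn
    return x, y
```

A full Poisson-type generator adds source terms to the right-hand side to attract grid lines toward walls or the rotor tip instead of letting them relax to uniformity.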
Deterministic generation of multiparticle entanglement by quantum Zeno dynamics.
Barontini, Giovanni; Hohmann, Leander; Haas, Florian; Estève, Jérôme; Reichel, Jakob
2015-09-18
Multiparticle entangled quantum states, a key resource in quantum-enhanced metrology and computing, are usually generated by coherent operations exclusively. However, unusual forms of quantum dynamics can be obtained when environment coupling is used as part of the state generation. In this work, we used quantum Zeno dynamics (QZD), based on nondestructive measurement with an optical microcavity, to deterministically generate different multiparticle entangled states in an ensemble of 36 qubit atoms in less than 5 microseconds. We characterized the resulting states by performing quantum tomography, yielding a time-resolved account of the entanglement generation. In addition, we studied the dependence of quantum states on measurement strength and quantified the depth of entanglement. Our results show that QZD is a versatile tool for fast and deterministic entanglement generation in quantum engineering applications. Copyright © 2015, American Association for the Advancement of Science.
Free-form surface measuring method based on optical theodolite measuring system
NASA Astrophysics Data System (ADS)
Yu, Caili
2012-10-01
In industrial metrology, single-point coordinates, lengths, and large-dimension curved surfaces can be measured through forward intersection using a theodolite measuring system composed of several optical theodolites and one computer. This paper introduces the measuring principle of a flexible large-dimension three-coordinate measuring system made up of two or more optical theodolites, together with the composition and functions of the system. For curved surfaces in particular, 3D measured data of a spatial free-form surface are acquired through the theodolite measuring system, and a CAD model is formed through surface fitting to directly generate CAM processing data.
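Forward intersection itself is a small computation: each theodolite contributes a horizontal sight direction from a known station, and the target is the intersection of the two sight lines. A planar sketch (real systems also use vertical angles and a least-squares adjustment over many stations):

```python
import math

def forward_intersection(p1, az1, p2, az2):
    """Intersect two horizontal sight rays from stations p1 and p2.

    Azimuths are measured from +y (north), clockwise, in radians.
    Returns the target point (x, y).
    """
    # Unit direction vectors of the two lines of sight
    d1 = (math.sin(az1), math.cos(az1))
    d2 = (math.sin(az2), math.cos(az2))
    # Solve p1 + t*d1 = p2 + s*d2 for t by Cramer's rule
    det = d1[0] * (-d2[1]) - (-d2[0]) * d1[1]
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    t = (rx * (-d2[1]) - (-d2[0]) * ry) / det
    return p1[0] + t * d1[0], p1[1] + t * d1[1]
```

With two stations 10 m apart sighting the same surface point, the two azimuth readings alone fix its horizontal position; sweeping many points yields the 3D cloud used for surface fitting.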
Computer-Generated Feedback on Student Writing
ERIC Educational Resources Information Center
Ware, Paige
2011-01-01
A distinction must be made between "computer-generated scoring" and "computer-generated feedback". Computer-generated scoring refers to the provision of automated scores derived from mathematical models built on organizational, syntactic, and mechanical aspects of writing. In contrast, computer-generated feedback, the focus of this article, refers…
Tomography and generative training with quantum Boltzmann machines
NASA Astrophysics Data System (ADS)
Kieferová, Mária; Wiebe, Nathan
2017-12-01
The promise of quantum neural nets, which utilize quantum effects to model complex data sets, has made their development an aspirational goal for quantum machine learning and quantum computing in general. Here we provide methods of training quantum Boltzmann machines. Our work generalizes existing methods and provides additional approaches for training quantum neural networks that compare favorably to existing methods. We further demonstrate that quantum Boltzmann machines enable a form of partial quantum state tomography that further provides a generative model for the input quantum state. Classical Boltzmann machines are incapable of this. This verifies the long-conjectured connection between tomography and quantum machine learning. Finally, we prove that classical computers cannot simulate our training process in general unless BQP=BPP , provide lower bounds on the complexity of the training procedures and numerically investigate training for small nonstoquastic Hamiltonians.
A two-level generative model for cloth representation and shape from shading.
Han, Feng; Zhu, Song-Chun
2007-07-01
In this paper, we present a two-level generative model for representing the images and surface depth maps of drapery and clothes. The upper level consists of a number of folds which generate the high-contrast (ridge) areas with a dictionary of shading primitives (for 2D images) and fold primitives (for 3D depth maps). These primitives are represented in parametric forms and are learned in a supervised learning phase using 3D surfaces of clothes acquired through photometric stereo. The lower level consists of the remaining flat areas which fill between the folds with a smoothness prior (Markov random field). We show that the classical ill-posed problem, shape from shading (SFS), can be much improved by this two-level model for its reduced dimensionality and incorporation of middle-level visual knowledge, i.e., the dictionary of primitives. Given an input image, we first infer the folds and compute a sketch graph using a sketch pursuit algorithm as in the primal sketch [10], [11]. The 3D folds are estimated by parameter fitting using the fold dictionary and they form the "skeleton" of the drapery/cloth surfaces. Then, the lower level is computed by a conventional SFS method using the fold areas as boundary conditions. The two levels interact at the final stage by optimizing a joint Bayesian posterior probability on the depth map. We show a number of experiments which demonstrate more robust results in comparison with state-of-the-art work. In a broader scope, our representation can be viewed as a two-level inhomogeneous MRF model which is applicable to general shape-from-X problems. Our study is an attempt to revisit Marr's idea [23] of computing the 2½D sketch from the primal sketch. In a companion paper [2], we study shape from stereo based on a similar two-level generative sketch representation.
Computation at a coordinate singularity
NASA Astrophysics Data System (ADS)
Prusa, Joseph M.
2018-05-01
Coordinate singularities are sometimes encountered in computational problems. An important example involves global atmospheric models used for climate and weather prediction. Classical spherical coordinates can be used to parameterize the manifold - that is, generate a grid for the computational spherical shell domain. This particular parameterization offers significant benefits such as orthogonality and exact representation of curvature and connection (Christoffel) coefficients. But it also exhibits two polar singularities, and at or near these points typical continuity/integral constraints on dependent fields and their derivatives are generally inadequate and lead to poor model performance and erroneous results. Other parameterizations have been developed that eliminate polar singularities, but problems of weaker singularities and enhanced grid noise compared to spherical coordinates (away from the poles) persist. In this study reparameterization invariance of geometric objects (scalars, vectors and the forms generated by their covariant derivatives) is utilized to generate asymptotic forms for dependent fields of interest valid in the neighborhood of a pole. The central concept is that such objects cannot be altered by the metric structure of a parameterization. The new boundary conditions enforce symmetries that are required for transformations of geometric objects. They are implemented in an implicit polar filter of a structured grid, nonhydrostatic global atmospheric model that is simulating idealized Held-Suarez flows. A series of test simulations using different configurations of the asymptotic boundary conditions are made, along with control simulations that use the default model numerics with no absorber, at three different grid sizes. Typically the test simulations are ∼20% faster in wall clock time than the control, resulting from a decrease in noise at the poles in all cases.
In the control simulations adverse numerical effects from the polar singularity are observed to increase with grid resolution. In contrast, test simulations demonstrate robust polar behavior independent of grid resolution.
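One concrete face of the polar singularity described above is easy to demonstrate: on a regular latitude-longitude grid the physical zonal spacing dx = a*cos(lat)*dlon collapses toward the poles, tightening any advective CFL limit there. A minimal numerical illustration (the radius, grid spacing, and wind speed are assumed values for illustration, not taken from the paper):

```python
import math

# On a regular lat-lon grid the physical zonal spacing dx = a*cos(lat)*dlon
# collapses toward the poles; an advection scheme's stable time step
# shrinks with it (CFL: dt <= dx / U).
a = 6.371e6          # Earth radius, m
dlon = math.radians(1.0)
U = 50.0             # wind speed, m/s (assumed for illustration)
for lat_deg in (0.0, 60.0, 89.0, 89.9):
    dx = a * math.cos(math.radians(lat_deg)) * dlon
    print(f"lat {lat_deg:5.1f}: dx = {dx:10.1f} m, CFL dt <= {dx / U:8.2f} s")
```

At 89.9 degrees latitude the zonal spacing is roughly 500 times smaller than at the equator, which is one reason untreated polar regions dominate noise and cost in lat-lon models.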
Computational Models and Emergent Properties of Respiratory Neural Networks
Lindsey, Bruce G.; Rybak, Ilya A.; Smith, Jeffrey C.
2012-01-01
Computational models of the neural control system for breathing in mammals provide a theoretical and computational framework bringing together experimental data obtained from different animal preparations under various experimental conditions. Many of these models were developed in parallel and iteratively with experimental studies and provided predictions guiding new experiments. This data-driven modeling approach has advanced our understanding of respiratory network architecture and neural mechanisms underlying generation of the respiratory rhythm and pattern, including their functional reorganization under different physiological conditions. Models reviewed here vary in neurobiological details and computational complexity and span multiple spatiotemporal scales of respiratory control mechanisms. Recent models describe interacting populations of respiratory neurons spatially distributed within the Bötzinger and pre-Bötzinger complexes and rostral ventrolateral medulla that contain core circuits of the respiratory central pattern generator (CPG). Network interactions within these circuits along with intrinsic rhythmogenic properties of neurons form a hierarchy of multiple rhythm generation mechanisms. The functional expression of these mechanisms is controlled by input drives from other brainstem components, including the retrotrapezoid nucleus and pons, which regulate the dynamic behavior of the core circuitry. The emerging view is that the brainstem respiratory network has rhythmogenic capabilities at multiple levels of circuit organization. This allows flexible, state-dependent expression of different neural pattern-generation mechanisms under various physiological conditions, enabling a wide repertoire of respiratory behaviors. Some models consider control of the respiratory CPG by pulmonary feedback and network reconfiguration during defensive behaviors such as cough. Future directions in modeling of the respiratory CPG are considered. PMID:23687564
A fast, time-accurate unsteady full potential scheme
NASA Technical Reports Server (NTRS)
Shankar, V.; Ide, H.; Gorski, J.; Osher, S.
1985-01-01
The unsteady form of the full potential equation is solved in conservation form by an implicit method based on approximate factorization. At each time level, internal Newton iterations are performed to achieve time accuracy and computational efficiency. A local time linearization procedure is introduced to provide a good initial guess for the Newton iteration. A novel flux-biasing technique is applied to generate proper forms of the artificial viscosity to treat hyperbolic regions with shocks and sonic lines present. The wake is properly modeled by accounting not only for jumps in phi, but also for jumps in higher derivatives of phi, obtained by requiring the density to be continuous across the wake. The far field is modeled using the Riemann invariants to simulate nonreflecting boundary conditions. The resulting unsteady method performs well, requiring fewer than 100 time steps per cycle at transonic Mach numbers even at low reduced frequencies of 0.1 or less. The code is fully vectorized for the CRAY-XMP and the VPS-32 computers.
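The time-stepping strategy in the abstract (an implicit step solved by internal Newton iterations, warm-started by a local time linearization) can be sketched on a scalar model equation. This is an illustrative analogue, not the paper's full-potential solver:

```python
def implicit_step(u_n, dt, f, dfdu, tol=1e-12, max_newton=20):
    """One backward-Euler step: solve g(u) = u - u_n - dt*f(u) = 0 by Newton.

    The initial guess comes from a local time linearization (one linearized
    implicit solve), mimicking the predictor used to keep Newton counts low."""
    # Linearized predictor: u0 = u_n + dt*f(u_n) / (1 - dt*f'(u_n))
    u = u_n + dt * f(u_n) / (1.0 - dt * dfdu(u_n))
    for _ in range(max_newton):
        g = u - u_n - dt * f(u)
        if abs(g) < tol:
            break
        u -= g / (1.0 - dt * dfdu(u))   # Newton update with g'(u) = 1 - dt*f'(u)
    return u

# Example: du/dt = -u^2, exact solution u(t) = 1/(1+t) from u(0) = 1
f = lambda u: -u * u
dfdu = lambda u: -2.0 * u
u, t, dt = 1.0, 0.0, 0.01
for _ in range(100):
    u = implicit_step(u, dt, f, dfdu)
    t += dt
print(abs(u - 1.0 / (1.0 + t)))  # small first-order discretization error
```

The linearized predictor typically lands close enough that only one or two Newton corrections are needed per step, which is the efficiency point the abstract makes.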
NASA Astrophysics Data System (ADS)
Morozov, Alexander; Dubinin, German; Dubynin, Sergey; Yanusik, Igor; Kim, Sun Il; Choi, Chil-Sung; Song, Hoon; Lee, Hong-Seok; Putilin, Andrey; Kopenkin, Sergey; Borodin, Yuriy
2017-06-01
Future commercialization of glasses-free holographic real 3D displays requires not only appropriate image quality but also a slim design of the backlight unit and the whole display device to match market needs. While much research has aimed to solve the computational issues of forming computer-generated holograms for 3D holographic displays, less attention has been paid to developing backlight units suitable for 3D holographic display applications with the form factor of conventional 2D display systems. We therefore report a coherent backlight unit for a 3D holographic display with thickness comparable to commercially available 2D displays (cell phones, tablets, laptops, etc.). The coherent backlight unit provides uniform, highly collimated and effective illumination of the spatial light modulator. Realization of such a backlight unit is possible due to holographic optical elements, based on volume gratings, that construct a coherent collimated beam to illuminate the display plane. The design, recording and measurement of a 5.5 inch coherent backlight unit based on two holographic optical elements are presented in this paper.
CAS2D: FORTRAN program for nonrotating blade-to-blade, steady, potential transonic cascade flows
NASA Technical Reports Server (NTRS)
Dulikravich, D. S.
1980-01-01
An exact, full-potential-equation (FPE) model for the steady, irrotational, homentropic and homoenergetic flow of a compressible, homocompositional, inviscid fluid through two dimensional planar cascades of airfoils was derived, together with its appropriate boundary conditions. A computer program, CAS2D, was developed that numerically solves an artificially time-dependent form of the actual FPE. The governing equation was discretized by using type-dependent, rotated finite differencing and the finite area technique. The flow field was discretized by providing a boundary-fitted, nonuniform computational mesh. The mesh was generated by using a sequence of conformal mapping, nonorthogonal coordinate stretching, and local, isoparametric, bilinear mapping functions. The discretized form of the FPE was solved iteratively by using successive line overrelaxation. The possible isentropic shocks were correctly captured by adding explicitly an artificial viscosity in a conservative form. In addition, a three-level consecutive mesh-refinement feature makes CAS2D a reliable and fast algorithm for the analysis of transonic, two dimensional cascade flows.
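Successive line overrelaxation, the iteration CAS2D uses, sweeps the grid one line at a time, solving a tridiagonal system along each line and overrelaxing the result. A minimal sketch for the Laplace equation (not the rotated-difference FPE discretization itself):

```python
import numpy as np

def thomas(a, b, c, d):
    """Solve a tridiagonal system: a sub-, b main-, c super-diagonal."""
    n = len(b)
    b, d = b.copy(), d.copy()
    for i in range(1, n):
        m = a[i] / b[i - 1]
        b[i] -= m * c[i - 1]
        d[i] -= m * d[i - 1]
    x = np.empty(n)
    x[-1] = d[-1] / b[-1]
    for i in range(n - 2, -1, -1):
        x[i] = (d[i] - c[i] * x[i + 1]) / b[i]
    return x

def slor_laplace(u, omega=1.7, sweeps=200):
    """Successive line overrelaxation for the 5-point Laplacian,
    solving one grid line at a time with a tridiagonal solve."""
    n = u.shape[0]
    m = n - 2  # interior points per line
    a = np.ones(m); b = -4.0 * np.ones(m); c = np.ones(m)
    a[0] = c[-1] = 0.0
    for _ in range(sweeps):
        for j in range(1, n - 1):
            d = -(u[1:-1, j - 1] + u[1:-1, j + 1])  # coupling to adjacent lines
            d[0] -= u[0, j]                          # Dirichlet boundary terms
            d[-1] -= u[-1, j]
            new_line = thomas(a, b, c, d)
            u[1:-1, j] += omega * (new_line - u[1:-1, j])  # overrelax
    return u

# Dirichlet data from the harmonic function u = x*y (exact for the 5-point stencil)
n = 17
x = np.linspace(0.0, 1.0, n)
exact = np.outer(x, x)
u = np.zeros((n, n))
u[0, :], u[-1, :], u[:, 0], u[:, -1] = exact[0, :], exact[-1, :], exact[:, 0], exact[:, -1]
u = slor_laplace(u)
print(np.abs(u - exact).max())
```

Because u = x*y is reproduced exactly by the 5-point stencil, the iterate converges to the analytic values, making correctness easy to check.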
Control mechanism of double-rotator-structure ternary optical computer
NASA Astrophysics Data System (ADS)
Kai, SONG; Liping, YAN
2017-03-01
Double-rotator-structure ternary optical processor (DRSTOP) has two characteristics, namely giant data-bits parallel computing and a reconfigurable processor; it can handle thousands of data bits in parallel and can run much faster than electronic computers and other optical computing systems developed so far. In order to put DRSTOP into practical application, this paper establishes a series of methods: a task classification method, a data-bits allocation method, a control information generation method, a control information formatting and sending method, and a decoded-results obtaining method. These methods form the control mechanism of DRSTOP. This control mechanism makes DRSTOP an automated computing platform. Compared with traditional calculation tools, the DRSTOP computing platform can ease the contradiction between high energy consumption and big-data computing by greatly reducing the cost of communications and I/O. Finally, the paper designed a set of experiments for the DRSTOP control mechanism to verify its feasibility and correctness. Experimental results showed that the control mechanism is correct, feasible and efficient.
Real-time simulation of the retina allowing visualization of each processing stage
NASA Astrophysics Data System (ADS)
Teeters, Jeffrey L.; Werblin, Frank S.
1991-08-01
The retina computes to let us see, but can we see the retina compute? Until now, the answer has been no, because the unconscious nature of the processing hides it from our view. Here the authors describe a method of seeing computations performed throughout the retina. This is achieved by using neurophysiological data to construct a model of the retina, and using a special-purpose image processing computer (PIPE) to implement the model in real time. Processing in the model is organized into stages corresponding to computations performed by each retinal cell type. The final stage is the transient (change detecting) ganglion cell. A CCD camera forms the input image, and the activity of a selected retinal cell type is the output which is displayed on a TV monitor. By changing the retina cell driving the monitor, the progressive transformations of the image by the retina can be observed. These simulations demonstrate the ubiquitous presence of temporal and spatial variations in the patterns of activity generated by the retina which are fed into the brain. The dynamical aspects make these patterns very different from those generated by the common DOG (Difference of Gaussian) model of receptive field. Because the retina is so successful in biological vision systems, the processing described here may be useful in machine vision.
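For contrast with the dynamical retina model above, the "common DOG model" the abstract mentions is purely static: a narrow excitatory Gaussian center minus a broad inhibitory surround. A small sketch (kernel sizes and sigmas are illustrative choices, not values from the paper):

```python
import numpy as np

def gaussian_kernel(size, sigma):
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return k / k.sum()

def dog_response(image, sigma_c=1.0, sigma_s=3.0, size=15):
    """Static center-surround (Difference-of-Gaussians) receptive field:
    narrow excitatory center minus broad inhibitory surround."""
    dog = gaussian_kernel(size, sigma_c) - gaussian_kernel(size, sigma_s)
    # direct 2-D convolution ('same' output, zero padding)
    pad = size // 2
    padded = np.pad(image, pad)
    out = np.zeros_like(image, dtype=float)
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            out[i, j] = np.sum(padded[i:i+size, j:j+size] * dog[::-1, ::-1])
    return out

# A uniform field evokes ~zero response; an edge evokes a strong one.
flat = np.ones((32, 32))
edge = np.ones((32, 32)); edge[:, 16:] = 0.0
r_flat = dog_response(flat)
r_edge = dog_response(edge)
print(abs(r_flat[16, 16]), abs(r_edge[16, 16]))
```

Because the two Gaussians each integrate to one, the kernel sums to zero and uniform input produces no output; the retina model in the paper adds the temporal dynamics this static filter lacks.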
NASA Technical Reports Server (NTRS)
Choo, Yung K.; Slater, John W.; Henderson, Todd L.; Bidwell, Colin S.; Braun, Donald C.; Chung, Joongkee
1998-01-01
TURBO-GRD is a software system for interactive two-dimensional boundary/field grid generation, modification, and refinement. Its features allow users to explicitly control grid quality locally and globally. The grid control can be achieved interactively by using control points that the user picks and moves on the workstation monitor or by direct stretching and refining. The techniques used in the code are the control point form of algebraic grid generation, a damped cubic spline for edge meshing, and parametric mapping between physical and computational domains. It also performs elliptic grid smoothing and free-form boundary control for boundary geometry manipulation. Internal block boundaries are constructed and shaped by using Bézier curves. Because TURBO-GRD is a highly interactive code, users can read in an initial solution, display its solution contour in the background of the grid and control net, and exercise grid modification using the solution contour as a guide. This process can be called interactive solution-adaptive grid generation.
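Algebraic grid generation of the kind TURBO-GRD builds on can be illustrated with transfinite interpolation, which blends four boundary curves into an interior mesh. This is a simpler cousin of the control-point form the abstract names, shown only to make the idea concrete:

```python
import numpy as np

def tfi_grid(bottom, top, left, right):
    """Algebraic grid generation by transfinite interpolation: blend four
    boundary curves (arrays of (x, y) points) into an interior mesh."""
    n, m = len(bottom), len(left)
    u = np.linspace(0.0, 1.0, n)[:, None, None]   # shape (n,1,1)
    v = np.linspace(0.0, 1.0, m)[None, :, None]   # shape (1,m,1)
    # Boolean-sum blend of the four edges minus the doubly counted corners.
    grid = ((1 - v) * bottom[:, None, :] + v * top[:, None, :]
            + (1 - u) * left[None, :, :] + u * right[None, :, :]
            - (1 - u) * (1 - v) * bottom[0] - u * (1 - v) * bottom[-1]
            - (1 - u) * v * top[0] - u * v * top[-1])
    return grid  # shape (n, m, 2)

# Unit-square boundaries: TFI must reproduce the bilinear grid exactly.
n, m = 9, 7
s = np.linspace(0.0, 1.0, n)
t = np.linspace(0.0, 1.0, m)
bottom = np.stack([s, np.zeros(n)], axis=1)
top = np.stack([s, np.ones(n)], axis=1)
left = np.stack([np.zeros(m), t], axis=1)
right = np.stack([np.ones(m), t], axis=1)
g = tfi_grid(bottom, top, left, right)
print(np.abs(g[:, :, 0] - s[:, None]).max(), np.abs(g[:, :, 1] - t[None, :]).max())
```

Replacing the straight boundary arrays with curved edge splines produces a boundary-fitted mesh, which an elliptic smoother like TURBO-GRD's can then relax.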
Lost in Second Life: Virtual Embodiment and Language Learning via Multimodal Communication
ERIC Educational Resources Information Center
Pasfield-Neofitou, Sarah; Huang, Hui; Grant, Scott
2015-01-01
Increased recognition of the role of the body and environment in cognition has taken place in recent decades in the form of new theories of embodied and extended cognition. The growing use of ever more sophisticated computer-generated 3D virtual worlds and avatars has added a new dimension to these theories of cognition. Both developments provide…
Toward mechanistic models of action-oriented and detached cognition.
Pezzulo, Giovanni
2016-01-01
To be successful, the research agenda for a novel control view of cognition should foresee more detailed, computationally specified process models of cognitive operations including higher cognition. These models should cover all domains of cognition, including those cognitive abilities that can be characterized as online interactive loops and detached forms of cognition that depend on internally generated neuronal processing.
McClure, Foster D; Lee, Jung-Keun
2003-01-01
The formula for the Horwitz ratio (HORRAT) as presented in the Study Director's Manual of AOAC INTERNATIONAL is applicable only when the concentration is in the unit/unit form (e.g., µg/µg, g/g, etc.). When the analyte concentration is a trace or mass fraction amount (e.g., µg/g), the formula generates incorrect HORRAT values. Alternative calculation procedures are presented to circumvent such problems.
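The unit issue the authors describe can be made concrete. The Horwitz prediction PRSD_R = 2*C**(-0.1505) takes C as a dimensionless mass fraction, so a concentration stated in µg/g must be scaled by 1e-6 before use. A sketch (the 16% observed RSD is an invented example value):

```python
def horrat(observed_rsd_percent, concentration, unit_factor=1.0):
    """Horwitz ratio: observed reproducibility RSD (%) divided by the
    Horwitz-predicted RSD. The prediction PRSD_R = 2 * C**(-0.1505)
    requires C as a dimensionless mass fraction (g/g); `unit_factor`
    converts the stated concentration into that form
    (e.g. 1e-6 for micrograms per gram)."""
    c = concentration * unit_factor      # dimensionless mass fraction
    predicted = 2.0 * c ** (-0.1505)
    return observed_rsd_percent / predicted

# The same 1 microgram-per-gram level stated two ways must give one HORRAT:
same_a = horrat(16.0, 1.0, unit_factor=1e-6)   # stated as 1 microg/g
same_b = horrat(16.0, 1e-6)                    # stated as a mass fraction
print(same_a, same_b)
```

Feeding the raw numeric value 1.0 into the formula without the unit factor would instead predict a 2% RSD and inflate the HORRAT eightfold, which is exactly the error the paper's alternative procedures are designed to avoid.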
Toward a molecular programming language for algorithmic self-assembly
NASA Astrophysics Data System (ADS)
Patitz, Matthew John
Self-assembly is the process whereby relatively simple components autonomously combine to form more complex objects. Nature exhibits self-assembly to form everything from microscopic crystals to living cells to galaxies. With a desire to both form increasingly sophisticated products and to understand the basic components of living systems, scientists have developed and studied artificial self-assembling systems. One such framework is the Tile Assembly Model introduced by Erik Winfree in 1998. In this model, simple two-dimensional square 'tiles' are designed so that they self-assemble into desired shapes. The work in this thesis consists of a series of results which build toward the future goal of designing an abstracted, high-level programming language for designing the molecular components of self-assembling systems which can perform powerful computations and form into intricate structures. The first two sets of results demonstrate self-assembling systems which perform infinite series of computations that characterize computably enumerable and decidable languages, and exhibit tools for algorithmically generating the necessary sets of tiles. In the next chapter, methods for generating tile sets which self-assemble into complicated shapes, namely a class of discrete self-similar fractal structures, are presented. Next, a software package for graphically designing tile sets, simulating their self-assembly, and debugging designed systems is discussed. Finally, a high-level programming language which abstracts much of the complexity and tedium of designing such systems, while preventing many of the common errors, is presented. The summation of this body of work presents a broad coverage of the spectrum of desired outputs from artificial self-assembling systems and a progression in the sophistication of tools used to design them. 
By creating a broader and deeper set of modular tools for designing self-assembling systems, we hope to increase the complexity which is attainable. These tools provide a solid foundation for future work in both the Tile Assembly Model and explorations into more advanced models.
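The cooperative, temperature-2 binding at the heart of the Tile Assembly Model can be captured in a few dozen lines. The simulator and tile set below are a toy sketch (an L-shape closed by a corner tile that needs two strength-1 glues), not the thesis's software package:

```python
# Each tile is a dict mapping a side ("N", "E", "S", "W") to a (glue, strength) pair.
def simulate(tiles, seed, temperature=2, max_steps=100):
    """Minimal abstract Tile Assembly Model: starting from a seed, repeatedly
    attach any tile whose matching glues on placed neighbours sum to at least
    the temperature."""
    placed = {(0, 0): seed}
    offsets = {"N": (0, 1), "E": (1, 0), "S": (0, -1), "W": (-1, 0)}
    opposite = {"N": "S", "E": "W", "S": "N", "W": "E"}
    for _ in range(max_steps):
        attached = False
        frontier = {(x + dx, y + dy)
                    for (x, y) in placed for dx, dy in offsets.values()}
        for pos in sorted(frontier - placed.keys()):
            for tile in tiles:
                strength = 0
                for side, (dx, dy) in offsets.items():
                    nbr = placed.get((pos[0] + dx, pos[1] + dy))
                    if nbr and side in tile and opposite[side] in nbr:
                        g1, s1 = tile[side]
                        g2, s2 = nbr[opposite[side]]
                        if g1 == g2 and s1 == s2:
                            strength += s1
                if strength >= temperature:
                    placed[pos] = tile
                    attached = True
                    break
            if attached:
                break
        if not attached:
            break
    return placed

# A strength-2 arm east, a strength-2 arm north, and a corner tile that can
# only attach *cooperatively* via two strength-1 glues from both arms.
seed = {"E": ("e", 2), "N": ("n", 2)}
arm_e = {"W": ("e", 2), "N": ("c1", 1)}
arm_n = {"S": ("n", 2), "E": ("c2", 1)}
corner = {"S": ("c1", 1), "W": ("c2", 1)}
asm = simulate([arm_e, arm_n, corner], seed)
print(sorted(asm.keys()))
```

Neither strength-1 glue alone can bind the corner at temperature 2; only the position adjacent to both arms reaches the threshold, which is the cooperative mechanism that gives the model its computational power.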
Graphic artist in computerland
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dolberg, K.M.
1983-01-01
The field of computer graphics is rapidly opening up to the graphic artist. It is not necessary to be a programming expert to enter this fascinating world. The capabilities of the medium are astounding: neon and metallic effects, translucent plastic and clear glass effects, sensitive 3-D shadings, limitless textures, and above all color. As with any medium, computer graphics has its advantages, such as speed, ease of form manipulation, and a variety of type fonts and alphabets. It also has its limitations, such as data input time, final output turnaround time, and not necessarily being the right medium for the job at hand. And finally, it is the time- and cost-saving characteristics of computer-generated visuals, as opposed to original artwork, that make computer graphics a viable alternative. This paper focuses on parts of the computer graphics system in use at the Los Alamos National Laboratory to provide specific examples.
A secure file manager for UNIX
DOE Office of Scientific and Technical Information (OSTI.GOV)
DeVries, R.G.
1990-12-31
The development of a secure file management system for a UNIX-based computer facility with supercomputers and workstations is described. Specifically, UNIX in its usual form does not address: (1) Operation which would satisfy rigorous security requirements. (2) Online space management in an environment where total data demands would be many times the actual online capacity. (3) Making the file management system part of a computer network in which users of any computer in the local network could retrieve data generated on any other computer in the network. The characteristics of UNIX can be exploited to develop a portable, secure file manager which would operate on computer systems ranging from workstations to supercomputers. Implementation considerations making unusual use of UNIX features, rather than requiring extensive internal system changes, are described, and implementation using the Cray Research Inc. UNICOS operating system is outlined.
The aeroacoustics of supersonic jets
NASA Technical Reports Server (NTRS)
Morris, Philip J.; McLaughlin, Dennis K.
1995-01-01
This research project was a joint experimental/computational study of noise in supersonic jets. The experiments were performed in a low to moderate Reynolds number anechoic supersonic jet facility. Computations have focused on the modeling of the effect of an external shroud on the generation and radiation of jet noise. This report summarizes the results of the research program in the form of the Masters and Doctoral theses of those students who obtained their degrees with the assistance of this research grant. In addition, the presentations and publications made by the principal investigators and the research students are appended.
NASA Technical Reports Server (NTRS)
Arnold, S. M.; Saleeb, A. F.; Tan, H. Q.; Zhang, Y.
1993-01-01
The issue of developing effective and robust schemes to implement a class of the Ogden-type hyperelastic constitutive models is addressed. To this end, special purpose functions (running under MACSYMA) are developed for the symbolic derivation, evaluation, and automatic FORTRAN code generation of explicit expressions for the corresponding stress function and material tangent stiffness tensors. These explicit forms are valid over the entire deformation range, since the singularities resulting from repeated principal-stretch values have been theoretically removed. The required computational algorithms are outlined, and the resulting FORTRAN computer code is presented.
Using CASE tools to write engineering specifications
NASA Astrophysics Data System (ADS)
Henry, James E.; Howard, Robert W.; Iveland, Scott T.
1993-08-01
There are always a wide variety of obstacles to writing and maintaining engineering documentation. To combat these problems, documentation generation can be linked to the process of engineering development. The same graphics and communication tools used for structured system analysis and design (SSA/SSD) also form the basis for the documentation. The goal is to build a living document, such that as an engineering design changes, the documentation will 'automatically' revise. 'Automatic' is qualified by the need to maintain textual descriptions associated with the SSA/SSD graphics, and the need to generate new documents. This paper describes a methodology and a computer aided system engineering toolset that enables a relatively seamless transition into document generation for the development engineering team.
Zilka, Miri; Dudenko, Dmytro V; Hughes, Colan E; Williams, P Andrew; Sturniolo, Simone; Franks, W Trent; Pickard, Chris J; Yates, Jonathan R; Harris, Kenneth D M; Brown, Steven P
2017-10-04
This paper explores the capability of using the DFT-D ab initio random structure searching (AIRSS) method to generate crystal structures of organic molecular materials, focusing on a system (m-aminobenzoic acid; m-ABA) that is known from experimental studies to exhibit abundant polymorphism. Within the structural constraints selected for the AIRSS calculations (specifically, centrosymmetric structures with Z = 4 for zwitterionic m-ABA molecules), the method is shown to successfully generate the two known polymorphs of m-ABA (form III and form IV) that have these structural features. We highlight various issues that are encountered in comparing crystal structures generated by AIRSS to experimental powder X-ray diffraction (XRD) data and solid-state magic-angle spinning (MAS) NMR data, demonstrating successful fitting for some of the lowest energy structures from the AIRSS calculations against experimental low-temperature powder XRD data for known polymorphs of m-ABA, and showing that comparison of computed and experimental solid-state NMR parameters allows different hydrogen-bonding motifs to be discriminated.
Auto-recognition of surfaces and auto-generation of material removal volume for finishing process
NASA Astrophysics Data System (ADS)
Kataraki, Pramod S.; Salman Abu Mansor, Mohd
2018-03-01
Auto-recognition of surfaces and auto-generation of material removal volumes for the recognised surfaces have become necessary for successful downstream manufacturing activities like automated process planning and scheduling. A few researchers have contributed to the generation of material removal volumes for a product, but their methods produce a discontinuity between two adjacent material removal volumes generated from two adjacent faces that form a convex geometry. To obtain material removal volumes free of this limitation, an algorithm was developed that automatically recognises a computer aided design (CAD) model's surfaces and auto-generates the material removal volume for the finishing process of the recognised surfaces. The surfaces of a CAD model are successfully recognised by the developed algorithm and the required material removal volume is obtained. The material removal volume discontinuity observed in earlier studies is eliminated.
Boruah, B R; Neil, M A A
2009-01-01
We describe the design and construction of a laser scanning confocal microscope with programmable beam forming optics. The amplitude, phase, and polarization of the laser beam used in the microscope can be controlled in real time with the help of a liquid crystal spatial light modulator, acting as a computer generated hologram, in conjunction with a polarizing beam splitter and a two right-angled-prism assembly. Two scan mirrors, comprising an on-axis fast moving scan mirror for line scanning and an off-axis slow moving scan mirror for frame scanning, configured in a way to minimize the movement of the scanned beam over the pupil plane of the microscope objective, form the XY scan unit. The confocal system, which incorporates the programmable beam forming unit and the scan unit, has been implemented to image both reflected and fluorescence light from the specimen. The system's ability to programmably generate custom-defined vector beams has been demonstrated by generating a bottle-structured focal volume, in fact the overlap of two cross-polarized beams, which can simultaneously improve both the lateral and axial resolutions if used as the de-excitation beam in a stimulated emission depletion confocal microscope.
Simulation and visualization of face seal motion stability by means of computer generated movies
NASA Technical Reports Server (NTRS)
Etsion, I.; Auer, B. M.
1980-01-01
A computer aided design method for mechanical face seals is described. Based on computer simulation, the actual motion of the flexibly mounted element of the seal can be visualized. This is achieved by solving the equations of motion of this element, calculating the displacements in its various degrees of freedom vs. time, and displaying the transient behavior in the form of a motion picture. Incorporating such a method in the design phase allows one to detect instabilities and to correct undesirable behavior of the seal. A theoretical background is presented. Details of the motion display technique are described, and the usefulness of the method is demonstrated by an example of a noncontacting conical face seal.
Simulation and visualization of face seal motion stability by means of computer generated movies
NASA Technical Reports Server (NTRS)
Etsion, I.; Auer, B. M.
1981-01-01
A computer aided design method for mechanical face seals is described. Based on computer simulation, the actual motion of the flexibly mounted element of the seal can be visualized. This is achieved by solving the equations of motion of this element, calculating the displacements in its various degrees of freedom vs. time, and displaying the transient behavior in the form of a motion picture. Incorporating such a method in the design phase allows one to detect instabilities and to correct undesirable behavior of the seal. A theoretical background is presented. Details of the motion display technique are described, and the usefulness of the method is demonstrated by an example of a noncontacting conical face seal.
NASA Technical Reports Server (NTRS)
Estes, R. H.
1977-01-01
A computer software system is described which computes global numerical solutions of the integro-differential Laplace tidal equations, including dissipation terms and ocean loading and self-gravitation effects, for arbitrary diurnal and semidiurnal tidal constituents. The integration algorithm features a successive approximation scheme for the integro-differential system, with forward differences in the time variable and central differences in the spatial variables. Solutions for the M2, S2, N2, K2, K1, O1, and P1 tidal constituents neglecting the effects of ocean loading and self-gravitation, and a converged M2 solution including ocean loading and self-gravitation effects, are presented in the form of cotidal and corange maps.
Prediction of sound radiation from different practical jet engine inlets
NASA Technical Reports Server (NTRS)
Zinn, B. T.; Meyer, W. L.
1982-01-01
The computer codes necessary for this study were developed and checked against exact solutions generated by the point source method using the NASA Lewis QCSEE inlet geometry. These computer codes were used to predict the acoustic properties of the following five inlet configurations: the NASA Langley Bellmouth, the NASA Lewis JT15D-1 Ground Test Nacelle, and three finite hyperbolic inlets of 50, 70 and 90 degrees. Thirty-five computer runs were done for the NASA Langley Bellmouth. For each of these computer runs, the reflection coefficient at the duct exit plane was calculated as was the far field radiation pattern. These results are presented in both graphical and tabular form with many of the results cross-plotted so that trends in the results versus cut-off ratio (wave number) and tangential mode number may be easily identified.
Modern Methods for fast generation of digital holograms
NASA Astrophysics Data System (ADS)
Tsang, P. W. M.; Liu, J. P.; Cheung, K. W. K.; Poon, T.-C.
2010-06-01
With the advancement of computers, digital holography (DH) has become an area of interest that has gained much popularity. Research findings derived from this technology enable holograms representing three-dimensional (3-D) scenes to be acquired with optical means, or generated with numerical computation. In both cases, the holograms are in the form of numerical data that can be recorded, transmitted, and processed with digital techniques. On top of that, the availability of high-capacity digital storage and wide-band communication technologies also casts light on the emergence of real-time video holographic systems, enabling animated 3-D contents to be encoded as holographic data and distributed via existing media. At present, development in DH has reached a reasonable degree of maturity, but at the same time the heavy computation involved imposes difficulty in practical applications. In this paper, a summary of a number of recent accomplishments in overcoming this problem is presented. Subsequently, we propose an economical framework that is suitable for real-time generation and transmission of holographic video signals over existing distribution media. The proposed framework includes an aspect of extending the depth range of the object scene, which is important for the display of large-scale objects.
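The simplest computer-generated hologram underlying such systems is the interference of a plane reference wave with the field of a single object point, i.e. a Fresnel zone plate. A sketch with assumed pixel pitch, wavelength, and object distance (values chosen only for illustration):

```python
import numpy as np

def point_source_hologram(n=256, pitch=8e-6, wavelength=633e-9, z=0.05):
    """Interference of an on-axis plane reference wave with the spherical
    wave from one object point at distance z: the classic Fresnel-zone CGH.
    Each hologram pixel records the intensity of reference + object waves."""
    ax = (np.arange(n) - n / 2) * pitch
    x, y = np.meshgrid(ax, ax)
    r = np.sqrt(x**2 + y**2 + z**2)                # path from the object point
    k = 2.0 * np.pi / wavelength
    obj = np.exp(1j * k * r) / r                   # spherical object wave
    ref = np.exp(1j * 0.0)                         # on-axis plane reference
    hologram = np.abs(ref + obj / np.abs(obj))**2  # unit-amplitude interference
    return hologram

h = point_source_hologram()
print(h.min(), h.max())
```

Realistic scenes sum millions of such point contributions per pixel, which is the computational burden the paper's fast-generation methods target.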
NASA Technical Reports Server (NTRS)
Wallace, G. R.; Weathers, G. D.; Graf, E. R.
1973-01-01
The statistics of filtered pseudorandom digital sequences called hybrid-sum sequences, formed from the modulo-two sum of several maximum-length sequences, are analyzed. The results indicate that a relation exists between the statistics of the filtered sequence and the characteristic polynomials of the component maximum length sequences. An analysis procedure is developed for identifying a large group of sequences with good statistical properties for applications requiring the generation of analog pseudorandom noise. By use of the analysis approach, the filtering process is approximated by the convolution of the sequence with a sum of unit step functions. A parameter reflecting the overall statistical properties of filtered pseudorandom sequences is derived. This parameter is called the statistical quality factor. A computer algorithm to calculate the statistical quality factor for the filtered sequences is presented, and the results for two examples of sequence combinations are included. The analysis reveals that the statistics of the signals generated with the hybrid-sum generator are potentially superior to the statistics of signals generated with maximum-length generators. Furthermore, fewer calculations are required to evaluate the statistics of a large group of hybrid-sum generators than are required to evaluate the statistics of the same size group of approximately equivalent maximum-length sequences.
Clement, R; Schneider, J; Brambs, H-J; Wunderlich, A; Geiger, M; Sander, F G
2004-02-01
The paper demonstrates how to generate an individual 3D volume model of a human single-rooted tooth using an automatic workflow. It can be implemented into finite element simulation. In several computational steps, computed tomography data of patients are used to obtain the global coordinates of the tooth's surface. First, the large number of geometric data is processed with several self-developed algorithms for a significant reduction. The most important task is to keep geometrical information of the real tooth. The second main part includes the creation of the volume model for tooth and periodontal ligament (PDL). This is realized with a continuous free form surface of the tooth based on the remaining points. Generating such irregular objects for numerical use in biomechanical research normally requires enormous manual effort and time. The finite element mesh of the tooth, consisting of hexahedral elements, is composed of different materials: dentin, PDL and surrounding alveolar bone. It is capable of simulating tooth movement in a finite element analysis and may give valuable information for a clinical approach without the restrictions of tetrahedral elements. The mesh generator of FE software ANSYS executed the mesh process for hexahedral elements successfully.
A Machine Learning Framework to Forecast Wave Conditions
NASA Astrophysics Data System (ADS)
Zhang, Y.; James, S. C.; O'Donncha, F.
2017-12-01
Recently, significant effort has been undertaken to quantify and extract wave energy because it is renewable, environmentally friendly, abundant, and often close to population centers. However, a major challenge is the ability to accurately and quickly predict energy production, especially across a 48-hour cycle. Accurate forecasting of wave conditions is a challenging undertaking that typically involves solving the spectral action-balance equation on a discretized grid with high spatial resolution. The nature of the computations typically demands high-performance computing infrastructure. Using a case-study site at Monterey Bay, California, a machine learning framework was trained to replicate numerically simulated wave conditions at a fraction of the typical computational cost. Specifically, the physics-based Simulating WAves Nearshore (SWAN) model, driven by measured wave conditions, nowcast ocean currents, and wind data, was used to generate training data for machine learning algorithms. The model was run between April 1st, 2013 and May 31st, 2017, generating forecasts at three-hour intervals and yielding 11,078 distinct model outputs. SWAN-generated fields of 3,104 wave heights and a characteristic period could be replicated through simple matrix multiplications using the mapping matrices from machine learning algorithms. In fact, wave-height RMSEs from the machine learning algorithms (9 cm) were less than those for the SWAN model-verification exercise, where those simulations were compared to buoy wave data within the model domain (>40 cm). The validated machine learning approach, which acts as an accurate surrogate for the SWAN model, can now be used to perform real-time forecasts of wave conditions for the next 48 hours using available forecasted boundary wave conditions, ocean currents, and winds.
This solution has obvious applications to wave-energy generation as accurate wave conditions can be forecasted with over a three-order-of-magnitude reduction in computational expense. The low computational cost (and by association low computer-power requirement) means that the machine learning algorithms could be installed on a wave-energy converter as a form of "edge computing" where a device could forecast its own 48-hour energy production.
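The "mapping matrices" described above amount to a linear surrogate: once a matrix mapping forcing conditions to wave-height fields has been fit, a forecast is a single matrix multiplication. A minimal sketch with synthetic stand-in data (all dimensions and variable names here are illustrative, not taken from the study):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for SWAN training data: each row pairs boundary
# forcing features (e.g. boundary wave height/period, wind) with the
# simulated field of wave heights at grid points.
n_runs, n_features, n_grid = 500, 6, 100
X = rng.normal(size=(n_runs, n_features))              # forcing conditions
M_true = rng.normal(size=(n_features, n_grid))         # unknown physics map
Y = X @ M_true + 0.01 * rng.normal(size=(n_runs, n_grid))  # "SWAN" outputs

# Fit the mapping matrix by least squares; a forecast is then just one
# matrix multiplication, as described in the abstract.
M_fit, *_ = np.linalg.lstsq(X, Y, rcond=None)

X_new = rng.normal(size=(10, n_features))
forecast = X_new @ M_fit                               # surrogate forecast
truth = X_new @ M_true
rmse = np.sqrt(np.mean((forecast - truth) ** 2))
print(f"surrogate RMSE: {rmse:.4f}")
```

Because prediction costs one matrix product rather than a full spectral solve, this kind of surrogate is cheap enough to run on embedded "edge" hardware.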
Exhaust Nozzle Plume and Shock Wave Interaction
NASA Technical Reports Server (NTRS)
Castner, Raymond S.; Elmiligui, Alaa; Cliff, Susan
2013-01-01
Fundamental research for sonic boom reduction is needed to quantify the interaction of shock waves generated from the aircraft wing or tail surfaces with the exhaust plume. Both the nozzle exhaust plume shape and the tail shock shape may be affected by an interaction that may alter the vehicle sonic boom signature. The plume and shock interaction was studied using Computational Fluid Dynamics simulation on two types of convergent-divergent nozzles and a simple wedge shock generator. The nozzle plume effects on the lower wedge compression region are evaluated for two- and three-dimensional nozzle plumes. Results show that the compression from the wedge deflects the nozzle plume and shocks form on the deflected lower plume boundary. The sonic boom pressure signature of the wedge is modified by the presence of the plume, and the computational predictions show significant (8 to 15 percent) changes in shock amplitude.
A fully vectorized numerical solution of the incompressible Navier-Stokes equations. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Patel, N.
1983-01-01
A vectorizable algorithm is presented for the implicit finite difference solution of the incompressible Navier-Stokes equations in general curvilinear coordinates. The unsteady Reynolds-averaged Navier-Stokes equations are solved in two dimensions in non-conservative, primitive-variable form. A two-layer algebraic eddy viscosity turbulence model is used to incorporate the effects of turbulence. Two momentum equations and a Poisson pressure equation, which is obtained by taking the divergence of the momentum equations and satisfying the continuity equation, are solved simultaneously at each time step. An elliptic grid generation approach is used to generate a boundary-conforming coordinate system about an airfoil. The governing equations are expressed in terms of the curvilinear coordinates and are solved on a uniform rectangular computational domain. A checkerboard SOR, which can effectively utilize the computer architectural concept of vector processing, is used for iterative solution of the governing equations.
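Checkerboard (red-black) SOR vectorizes because grid points of one color depend only on points of the other color, so each half-sweep can be applied to a whole color at once. A minimal sketch for a model Poisson problem on the unit square (the thesis applies this to the pressure equation on a body-fitted grid; this toy problem is only illustrative):

```python
import numpy as np

# Red-black ("checkerboard") SOR for -u_xx - u_yy = g on the unit square
# with u = 0 on the boundary; exact solution sin(pi x) sin(pi y).
n = 33                                    # grid points per side
h = 1.0 / (n - 1)
x = np.linspace(0.0, 1.0, n)
X, Y = np.meshgrid(x, x, indexing="ij")
g = 2.0 * np.pi**2 * np.sin(np.pi * X) * np.sin(np.pi * Y)
exact = np.sin(np.pi * X) * np.sin(np.pi * Y)
u = np.zeros((n, n))

omega = 2.0 / (1.0 + np.sin(np.pi * h))   # near-optimal SOR factor
mask = (np.add.outer(np.arange(n), np.arange(n)) % 2 == 0)  # "red" points

for _ in range(400):
    for color in (True, False):
        # Gauss-Seidel value from the 5-point stencil, whole grid at once
        gs = 0.25 * (np.roll(u, 1, 0) + np.roll(u, -1, 0)
                     + np.roll(u, 1, 1) + np.roll(u, -1, 1) + h * h * g)
        # Update only the active color; points of one color never
        # neighbor each other, so the simultaneous update is exact SOR.
        u = np.where(mask == color, (1 - omega) * u + omega * gs, u)
        u[0, :] = u[-1, :] = u[:, 0] = u[:, -1] = 0.0  # re-impose boundary

err = np.abs(u - exact).max()
print(f"max error vs exact solution: {err:.2e}")
```

The residual error is dominated by the O(h²) discretization error of the 5-point stencil, not by the iteration.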
Wedge Shock and Nozzle Exhaust Plume Interaction in a Supersonic Jet Flow
NASA Technical Reports Server (NTRS)
Castner, Raymond; Zaman, Khairul; Fagan, Amy; Heath, Christopher
2014-01-01
Fundamental research for sonic boom reduction is needed to quantify the interaction of shock waves generated from the aircraft wing or tail surfaces with the nozzle exhaust plume. Aft body shock waves that interact with the exhaust plume contribute to the near-field pressure signature of a vehicle. The plume and shock interaction was studied using computational fluid dynamics and compared with experimental data from a coaxial convergent-divergent nozzle flow in an open jet facility. A simple diamond-shaped wedge was used to generate the shock in the outer flow to study its impact on the inner jet flow. Results show that the compression from the wedge deflects the nozzle plume and shocks form on the opposite plume boundary. The sonic boom pressure signature of the nozzle exhaust plume was modified by the presence of the wedge. Both the experimental results and computational predictions show changes in plume deflection.
NASA Astrophysics Data System (ADS)
Khaimovich, A. I.; Khaimovich, I. N.
2018-01-01
The article provides calculation algorithms for blank design and die-forming tooling to produce compressor blades for aircraft engines. The design system proposed in the article generates drafts of trimming and reducing dies automatically, leading to a significant reduction of production preparation time. A detailed analysis of the features of the blade structural elements was carried out; the adopted limitations and technological solutions made it possible to formulate generalized algorithms for shaping the parting stamp face over the entire contour of the engraving for different configurations of die forgings. The authors developed algorithms and programs to calculate the three-dimensional point locations describing the configuration of the die cavity.
Neuroadaptive technology enables implicit cursor control based on medial prefrontal cortex activity.
Zander, Thorsten O; Krol, Laurens R; Birbaumer, Niels P; Gramann, Klaus
2016-12-27
The effectiveness of today's human-machine interaction is limited by a communication bottleneck as operators are required to translate high-level concepts into a machine-mandated sequence of instructions. In contrast, we demonstrate effective, goal-oriented control of a computer system without any form of explicit communication from the human operator. Instead, the system generated the necessary input itself, based on real-time analysis of brain activity. Specific brain responses were evoked by violating the operators' expectations to varying degrees. The evoked brain activity demonstrated detectable differences reflecting congruency with or deviations from the operators' expectations. Real-time analysis of this activity was used to build a user model of those expectations, thus representing the optimal (expected) state as perceived by the operator. Based on this model, which was continuously updated, the computer automatically adapted itself to the expectations of its operator. Further analyses showed this evoked activity to originate from the medial prefrontal cortex and to exhibit a linear correspondence to the degree of expectation violation. These findings extend our understanding of human predictive coding and provide evidence that the information used to generate the user model is task-specific and reflects goal congruency. This paper demonstrates a form of interaction without any explicit input by the operator, enabling computer systems to become neuroadaptive, that is, to automatically adapt to specific aspects of their operator's mindset. Neuroadaptive technology significantly widens the communication bottleneck and has the potential to fundamentally change the way we interact with technology.
Analysis of a Multiprocessor Guidance Computer. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Maltach, E. G.
1969-01-01
The design of the next generation of spaceborne digital computers is described, and a possible multiprocessor computer configuration is analyzed. For the analysis, a set of representative space computing tasks was abstracted from the Lunar Module Guidance Computer programs of the Apollo program as executed during the lunar landing. At that time this computer performed about 24 concurrent functions, with iteration rates from 10 times per second to once every two seconds. These jobs were tabulated in a machine-independent form, and statistics of the overall job set were obtained. It was concluded, based on a comparison of simulation and Markov results, that the Markov process analysis is accurate in predicting overall trends and in configuration comparisons, but does not provide useful detailed information in specific situations. Using both types of analysis, it was determined that the job scheduling function is critical for the efficiency of the multiprocessor. It is recommended that research into automatic job scheduling be performed.
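A Markov-process analysis of the kind described typically reduces to computing a stationary distribution over machine states. A toy sketch with invented states and transition probabilities (hypothetical, not taken from the thesis):

```python
import numpy as np

# Hypothetical 3-state chain for one processor: idle, running a job,
# blocked waiting on the scheduler.  Probabilities are illustrative only.
P = np.array([
    [0.6, 0.4, 0.0],   # idle    -> idle / running
    [0.2, 0.5, 0.3],   # running -> idle / running / blocked
    [0.5, 0.5, 0.0],   # blocked -> idle / running
])

# Stationary distribution: solve pi P = pi subject to sum(pi) = 1.
A = np.vstack([P.T - np.eye(3), np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print("steady-state occupancy (idle, running, blocked):", np.round(pi, 3))
```

The long-run fraction of time spent blocked is the kind of quantity that exposes scheduling as the efficiency bottleneck.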
PROTEUS two-dimensional Navier-Stokes computer code, version 1.0. Volume 3: Programmer's reference
NASA Technical Reports Server (NTRS)
Towne, Charles E.; Schwab, John R.; Benson, Thomas J.; Suresh, Ambady
1990-01-01
A new computer code was developed to solve the 2-D or axisymmetric, Reynolds-averaged, unsteady compressible Navier-Stokes equations in strong conservation law form. The thin-layer or Euler equations may also be solved. Turbulence is modeled using an algebraic eddy viscosity model. The objective was to develop a code for aerospace applications that is easy to use and easy to modify. Code readability, modularity, and documentation were emphasized. The equations are written in nonorthogonal body-fitted coordinates, and solved by marching in time using a fully-coupled alternating-direction-implicit procedure with generalized first- or second-order time differencing. All terms are linearized using second-order Taylor series. The boundary conditions are treated implicitly, and may be steady, unsteady, or spatially periodic. Simple Cartesian or polar grids may be generated internally by the program. More complex geometries require an externally generated computational coordinate system. The documentation is divided into three volumes. Volume 3 is the Programmer's Reference, and describes the program structure, the FORTRAN variables stored in common blocks, and the details of each subprogram.
Near- and far-field aerodynamics in insect hovering flight: an integrated computational study.
Aono, Hikaru; Liang, Fuyou; Liu, Hao
2008-01-01
We present the first integrative computational fluid dynamics (CFD) study of near- and far-field aerodynamics in insect hovering flight using a biology-inspired, dynamic flight simulator. This simulator, which has been built to encompass multiple mechanisms and principles related to insect flight, is capable of 'flying' an insect on the basis of realistic wing-body morphologies and kinematics. Our CFD study integrates near- and far-field wake dynamics and shows the detailed three-dimensional (3D) near- and far-field vortex flows: a horseshoe-shaped vortex is generated and wraps around the wing in the early down- and upstroke; subsequently, the horseshoe-shaped vortex grows into a doughnut-shaped vortex ring, with an intense jet-stream present in its core, forming the downwash; and eventually, the doughnut-shaped vortex rings of the wing pair break up into two circular vortex rings in the wake. The computed aerodynamic forces show reasonable agreement with experimental results in terms of both the mean force (vertical, horizontal and sideslip forces) and the time course over one stroke cycle (lift and drag forces). A large amount of lift force (approximately 62% of total lift force generated over a full wingbeat cycle) is generated during the upstroke, most likely due to the presence of intensive and stable, leading-edge vortices (LEVs) and wing tip vortices (TVs); and correspondingly, a much stronger downwash is observed compared to the downstroke. We also estimated hovering energetics based on the computed aerodynamic and inertial torques, and powers.
Establishing Linux Clusters for High-Performance Computing (HPC) at NPS
2004-09-01
[Garbled figure-list fragment; recoverable content: an md5sum was generated for the Area51 roll, with all file information available, so that it can be checked against the value the vendor provides at the download site for the particular piece of software.]
Transient multivariable sensor evaluation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vilim, Richard B.; Heifetz, Alexander
A method and system for performing transient multivariable sensor evaluation. The method and system include a computer system for identifying a model form, providing training measurement data, generating a basis vector, monitoring system data from a sensor, loading the system data into non-transient memory, performing an estimation to provide desired data, comparing the system data to the desired data, and outputting an alarm for a defective sensor.
Non-Roman Font Generation Via Interactive Computer Graphics,
1986-07-01
[Glossary fragment:] hiragana — one of two sets of kana representing the same set of sounds; a cursive script for transcribing native Japanese words (including those borrowed) into written language; hiragana have a cursive (handwritten) appearance. homophone — a syllable or word which … kana — symbol sets used for transcribing spoken Japanese into written form; these symbol sets are syllabaries (see also hiragana, katakana). kanji — "Chinese characters" (Japanese) (see also hanzi).
Towards data warehousing and mining of protein unfolding simulation data.
Berrar, Daniel; Stahl, Frederic; Silva, Candida; Rodrigues, J Rui; Brito, Rui M M; Dubitzky, Werner
2005-10-01
The prediction of protein structure and the precise understanding of protein folding and unfolding processes remains one of the greatest challenges in structural biology and bioinformatics. Computer simulations based on molecular dynamics (MD) are at the forefront of the effort to gain a deeper understanding of these complex processes. Currently, these MD simulations are usually on the order of tens of nanoseconds, generate a large amount of conformational data, and are computationally expensive. More and more groups run such simulations and generate a myriad of data, which raises new challenges in managing and analyzing these data. Because of the vast range of proteins researchers want to study and simulate, the computational effort needed to generate data, the large data volumes involved, and the different types of analyses scientists need to perform, it is desirable to provide a public repository allowing researchers to pool and share protein unfolding data. To adequately organize, manage, and analyze the data generated by unfolding simulation studies, we designed a data warehouse system that is embedded in a grid environment to facilitate the seamless sharing of available computer resources and thus enable many groups to share complex molecular dynamics simulations on a more regular basis. To gain insight into the conformational fluctuations and stability of the monomeric forms of the amyloidogenic protein transthyretin (TTR), molecular dynamics unfolding simulations of the monomer of human TTR have been conducted. Trajectory data and meta-data of the wild-type (WT) protein and the highly amyloidogenic variant L55P-TTR represent the test case for the data warehouse. Web and grid services, especially pre-defined data mining services that can run on or 'near' the data repository of the data warehouse, are likely to play a pivotal role in the analysis of molecular dynamics unfolding data.
Computational design of active, self-reinforcing gels.
Yashin, Victor V; Kuksenok, Olga; Balazs, Anna C
2010-05-20
Many living organisms have evolved a protective mechanism that allows them to reversibly alter their stiffness in response to mechanical contact. Using theoretical modeling, we design a mechanoresponsive polymer gel that exhibits a similar self-reinforcing behavior. We focus on cross-linked gels that contain Ru(terpy)(2) units, where both terpyridine ligands are grafted to the chains. The Ru(terpy)(2) complex forms additional, chemoresponsive cross-links that break and re-form in response to a repeated oxidation and reduction of the Ru. In our model, the periodic redox variations of the anchored metal ion are generated by the Belousov-Zhabotinsky (BZ) reaction. Our computer simulations reveal that compression of the BZ gel leads to a stiffening of the sample due to an increase in the cross-link density. These findings provide guidelines for designing biomimetic, active coatings that send out a signal when the system is impacted and use this signaling process to initiate the self-protecting behavior.
Using ordinal partition transition networks to analyze ECG data
NASA Astrophysics Data System (ADS)
Kulp, Christopher W.; Chobot, Jeremy M.; Freitas, Helena R.; Sprechini, Gene D.
2016-07-01
Electrocardiogram (ECG) data from patients with a variety of heart conditions are studied using ordinal pattern partition networks. The ordinal pattern partition networks are formed from the ECG time series by symbolizing the data into ordinal patterns. The ordinal patterns form the nodes of the network, and edges are defined through the time ordering of the ordinal patterns in the symbolized time series. A network measure, called the mean degree, is computed from each time-series-generated network. In addition, the entropy and the number of non-occurring ordinal patterns (NFP) are computed for each series. The distributions of mean degrees, entropies, and NFPs for each heart condition studied are compared. A statistically significant difference between healthy patients and several groups of unhealthy patients with varying heart conditions is found for the distributions of the mean degrees, unlike for any of the distributions of the entropies or NFPs.
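The construction described above can be sketched directly: symbolize the series into overlapping ordinal patterns, link consecutively occurring patterns, and take the mean degree. A minimal sketch (embedding dimension 3 and a directed-transition edge set are assumptions; the paper's exact parameters and network definition may differ):

```python
import numpy as np

def ordinal_patterns(series, dim=3):
    """Symbolize a time series into overlapping ordinal patterns."""
    s = np.asarray(series)
    return [tuple(np.argsort(s[i:i + dim])) for i in range(len(s) - dim + 1)]

def mean_degree(series, dim=3):
    """Mean degree of the ordinal partition transition network."""
    pats = ordinal_patterns(series, dim)
    edges = set(zip(pats, pats[1:]))      # distinct observed transitions
    deg = {}
    for a, b in edges:                    # each edge touches two nodes
        deg[a] = deg.get(a, 0) + 1
        deg[b] = deg.get(b, 0) + 1
    return sum(deg.values()) / len(set(pats))

t = np.linspace(0, 8 * np.pi, 500)
k_sine = mean_degree(np.sin(t))                                  # periodic
k_noise = mean_degree(np.random.default_rng(1).normal(size=500))  # irregular
print(f"mean degree, sine: {k_sine:.2f}; noise: {k_noise:.2f}")
```

A periodic signal visits few patterns and transitions, so its mean degree is lower than that of an irregular signal, which is the kind of contrast the paper exploits between patient groups.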
Renormalization group, normal form theory and the Ising model
NASA Astrophysics Data System (ADS)
Raju, Archishman; Hayden, Lorien; Clement, Colin; Liarte, Danilo; Sethna, James
The results of the renormalization group are commonly advertised as the existence of power law singularities at critical points. Logarithmic and exponential corrections are seen as special cases and dealt with on a case-by-case basis. We propose to systematize computing the singularities in the renormalization group using perturbative normal form theory. This gives us a way to classify all such singularities in a unified framework and to generate a systematic machinery to do scaling collapses. We show that this procedure leads to some new results even in classic cases like the Ising model and has general applicability.
NASA Astrophysics Data System (ADS)
Waite, C. T.
2013-04-01
Moonwalk is a stroll on the Moon through time and space, a lyrical history of humanity's scientific and allegorical relationship with the Moon from the beginnings of culture to the Space Age and the memories of the Cold-War generation. It is an experimental film in both genre and form, a computer animation designed for projection on a planetarium cupola. A hemispherical film, Moonwalk creates an immersive experience. The fulldome format presents aesthetic and technical challenges to create a new form of imagery and spatial montage. A seven-minute excerpt of the work-in-progress was shown at INSAPV in the Adler Planetarium, Chicago.
Kumar, Vipul; Punetha, Ankita; Sundar, Durai; Chaudhuri, Tapan K
2012-01-01
Molecular chaperones appear to have evolved to facilitate protein folding in the cell through entrapment of folding intermediates on the interior of a large cavity formed between GroEL and its co-chaperonin GroES. They bind newly synthesized or non-native polypeptides through hydrophobic interactions and prevent their aggregation. Some proteins do not interact with GroEL; hence, even though they are aggregation prone, they cannot be assisted by GroEL for their folding. In this study, we have attempted to engineer these non-substrate proteins to convert them into substrates for GroEL without compromising their function. We have used a computational biology approach to generate mutants of the selected proteins by selectively mutating residues in the hydrophobic patch, similar to the GroES mobile loop region, that are responsible for interaction with GroEL, and compared them with their wild-type counterparts by calculating their instability and aggregation propensities. The energies of the newly designed mutants were computed through molecular dynamics simulations. We observed increased aggregation propensity of some of the mutants formed after replacing charged amino acid residues with hydrophobic ones in the well-defined hydrophobic patch, raising the possibility of their binding to GroEL. The newly generated mutants may provide potential substrates for the chaperonin GroEL, which can be experimentally produced and tested for their aggregation tendency, interactions with GroEL, and the possibility of chaperone-assisted folding to yield functional proteins.
Exact posterior computation in non-conjugate Gaussian location-scale parameters models
NASA Astrophysics Data System (ADS)
Andrade, J. A. A.; Rathie, P. N.
2017-12-01
In Bayesian analysis the class of conjugate models allows one to obtain exact posterior distributions; however, this class is quite restrictive in the sense that it involves only a few distributions. In fact, most practical applications involve non-conjugate models, so approximate methods, such as MCMC algorithms, are required. Although these methods can deal with quite complex structures, some practical problems can make their application very time demanding: for example, when heavy-tailed distributions are used, convergence may be difficult and the Metropolis-Hastings algorithm can become very slow, in addition to the extra work inevitably required in choosing efficient candidate-generator distributions. In this work, we draw attention to special functions as tools for Bayesian computation, and we propose an alternative method for obtaining the posterior distribution in Gaussian non-conjugate models in exact form. We use complex-integration methods based on the H-function to obtain the posterior distribution, and some of its posterior quantities, in an explicitly computable form. Two examples are provided to illustrate the theory.
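For contrast with the paper's exact H-function approach, a non-conjugate Gaussian location model can always be normalized by brute-force quadrature. The sketch below (model and data invented for illustration) shows the kind of posterior the paper computes in closed form:

```python
import numpy as np

# Non-conjugate model: y_i ~ N(mu, 1) with a heavy-tailed Student-t(3)
# prior on mu.  No conjugate update exists, so here the posterior is
# normalized on a grid; the paper instead derives it exactly.
y = np.array([1.8, 2.2, 2.0, 2.4])
mu = np.linspace(-10, 10, 20001)
dmu = mu[1] - mu[0]

nu = 3.0
log_prior = -0.5 * (nu + 1) * np.log1p(mu**2 / nu)          # t_3 kernel
log_like = -0.5 * ((y[:, None] - mu[None, :]) ** 2).sum(0)  # Gaussian
log_post = log_prior + log_like
post = np.exp(log_post - log_post.max())
post /= post.sum() * dmu                                    # normalize

post_mean = np.sum(mu * post) * dmu
print(f"posterior mean: {post_mean:.3f}")
```

The heavy-tailed prior shrinks the sample mean (2.1) modestly toward the prior mode at zero; an exact method returns this same quantity without any grid.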
Solute solver 'what if' module for modeling urea kinetics.
Daugirdas, John T
2016-11-01
The publicly available Solute Solver module allows calculation of a variety of two-pool urea kinetic measures of dialysis adequacy using pre- and postdialysis plasma urea and estimated dialyzer clearance or estimated urea distribution volumes as inputs. However, the existing program does not have a 'what if' module, which would estimate the plasma urea values as well as commonly used measures of hemodialysis adequacy for a patient with a given urea distribution volume and urea nitrogen generation rate dialyzed according to a particular dialysis schedule. Conventional variable extracellular volume 2-pool urea kinetic equations were used. A javascript-HTML Web form was created that can be used on any personal computer equipped with internet browsing software, to compute commonly used Kt/V-based measures of hemodialysis adequacy for patients with differing amounts of residual kidney function and following a variety of treatment schedules. The completed Web form calculator may be particularly useful in computing equivalent continuous clearances for incremental hemodialysis strategies. © The Author 2016. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.
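For orientation, the single-pool second-generation Daugirdas formula gives a simpler Kt/V estimate than the two-pool, variable-volume model Solute Solver implements; a minimal sketch (example inputs are invented):

```python
import math

def sp_ktv(pre_bun, post_bun, t_hours, uf_liters, post_weight_kg):
    """Single-pool Kt/V via the second-generation Daugirdas formula.

    Note: this is the simpler single-pool estimate, not the two-pool
    variable-volume kinetics that the Solute Solver module implements.
    """
    r = post_bun / pre_bun
    return (-math.log(r - 0.008 * t_hours)
            + (4.0 - 3.5 * r) * uf_liters / post_weight_kg)

# Example: pre-BUN 70, post-BUN 23 mg/dL, 4 h session, 2 L ultrafiltered,
# 70 kg postdialysis weight (illustrative numbers).
ktv = sp_ktv(70.0, 23.0, 4.0, 2.0, 70.0)
print(f"spKt/V = {ktv:.2f}")
```

A 'what if' module essentially inverts this kind of relationship: given volume and urea generation rate, it predicts the pre- and post-BUN values a schedule would produce.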
NASA Astrophysics Data System (ADS)
Tsai, Chun-Wei; Lyu, Bo-Han; Wang, Chen; Hung, Cheng-Chieh
2017-05-01
We have developed multi-function, easy-to-use modulation software based on the LabVIEW system. The software provides four main functions: computer-generated hologram (CGH) generation, CGH reconstruction, image trimming, and special phase distribution. Based on this CGH modulation software, we can make a liquid-crystal-on-silicon spatial light modulator (LCoS-SLM) perform much like a diffractive optical element (DOE) and use it in various adaptive optics (AO) applications. Through the development of special phase distributions, we are applying the LCoS-SLM with the CGH modulation software to AO technology, such as an optical microscope system. When the LCoS-SLM panel is integrated into an optical microscope system, it can be placed in the illumination path or in the image-forming path. The LCoS-SLM provides a program-controllable liquid crystal array for the optical microscope: it dynamically changes the amplitude or phase of light, giving the system the obvious advantage of flexibility.
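The abstract does not name its CGH algorithm; one standard choice for computing a phase-only hologram for an LCoS-SLM is Gerchberg-Saxton iterative phase retrieval, sketched here with an invented target image:

```python
import numpy as np

# Gerchberg-Saxton iteration for a phase-only CGH: find an SLM phase
# pattern whose far-field (Fourier-plane) intensity approximates a
# target image.  (An assumption for illustration; the paper's software
# may compute its CGHs differently.)
rng = np.random.default_rng(0)
n = 64
target = np.zeros((n, n))
target[24:40, 24:40] = 1.0                       # simple square target
target_amp = np.sqrt(target)

phase = rng.uniform(0, 2 * np.pi, (n, n))        # random initial phase

def recon_error(phase):
    far = np.fft.fft2(np.exp(1j * phase))
    inten = np.abs(far) ** 2
    return np.abs(inten / inten.sum() - target / target.sum()).sum()

err_init = recon_error(phase)
for _ in range(30):
    far = np.fft.fft2(np.exp(1j * phase))        # propagate to image plane
    far = target_amp * np.exp(1j * np.angle(far))  # impose target amplitude
    near = np.fft.ifft2(far)                     # back to the SLM plane
    phase = np.angle(near)                       # keep phase, unit amplitude
err_final = recon_error(phase)
print(f"reconstruction error: {err_init:.3f} -> {err_final:.3f}")
```

The final `phase` array is what would be written to the SLM pixels; the two amplitude constraints (unit at the SLM, target at the image) are what make the retrieved phase act like a DOE.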
A Numerical Method of Calculating Propeller Noise Including Acoustic Nonlinear Effects
NASA Technical Reports Server (NTRS)
Korkan, K. D.
1985-01-01
Using the transonic flow field(s) generated by the NASPROP-E computer code for an eight-blade SR3-series propeller, a theoretical method is investigated to calculate the total noise values and frequency content in the acoustic near and far field without using the Ffowcs Williams-Hawkings equation. The flow field is numerically generated using an implicit three-dimensional Euler equation solver in weak conservation-law form. Numerical damping is required by the differencing method for stability in three dimensions, and the influence of the damping on the calculated acoustic values is investigated. The acoustic near field is solved by integrating with respect to time the pressure oscillations induced at a stationary observer location. The acoustic far field is calculated from the near-field primitive variables generated by the NASPROP-E computer code using a method involving a perturbation velocity potential, as suggested by Hawkings, to compute the acoustic pressure time history at a specified far-field observer location. The methodologies described are valid for calculating total noise levels and are applicable to any propeller geometry for which a flow field solution is available.
Holo-Chidi video concentrator card
NASA Astrophysics Data System (ADS)
Nwodoh, Thomas A.; Prabhakar, Aditya; Benton, Stephen A.
2001-12-01
The Holo-Chidi Video Concentrator Card is a frame buffer for the Holo-Chidi holographic video processing system. Holo-Chidi is designed at the MIT Media Laboratory for real-time computation of computer generated holograms and the subsequent display of the holograms at video frame rates. The Holo-Chidi system is made of two sets of cards - the set of Processor cards and the set of Video Concentrator Cards (VCCs). The Processor cards are used for hologram computation, data archival/retrieval from a host system, and for higher-level control of the VCCs. The VCC formats computed holographic data from multiple hologram computing Processor cards, converting the digital data to analog form to feed the acousto-optic modulators of the Media Lab's Mark-II holographic display system. The Video Concentrator Card is made of: a High-Speed I/O (HSIO) interface whence data is transferred from the hologram computing Processor cards, a set of FIFOs and video RAM used as buffer for data for the hololines being displayed, a one-chip integrated microprocessor and peripheral combination that handles communication with other VCCs and furnishes the card with a USB port, a co-processor which controls display data formatting, and D-to-A converters that convert digital fringes to analog form. The co-processor is implemented with an SRAM-based FPGA with over 500,000 gates and controls all the signals needed to format the data from the multiple Processor cards into the format required by Mark-II. A VCC has three HSIO ports through which up to 500 megabytes of computed holographic data can flow from the Processor cards to the VCC per second. A Holo-Chidi system with three VCCs has enough frame buffering capacity to hold up to thirty-two 36-megabyte hologram frames at a time. Pre-computed holograms may also be loaded into the VCC from a host computer through the low-speed USB port.
Both the microprocessor and the co-processor in the VCC can access the main system memory used to store control programs and data for the VCC. The card also generates the control signals used by the scanning mirrors of Mark-II. In this paper we discuss the design of the VCC and its implementation in the Holo-Chidi system.
NASA Technical Reports Server (NTRS)
Quek, Kok How Francis
1990-01-01
A method of computing reliable Gaussian and mean curvature sign-map descriptors from the polynomial approximation of surfaces was demonstrated. Such descriptors, which are invariant under perspective variation, are suitable for hypothesis generation. A means was developed for determining the pose of constructed geometric forms whose algebraic surface descriptors are nonlinear in terms of their orienting parameters. This was done by means of linear functions which are capable of approximating nonlinear forms and determining their parameters. It was shown that biquadratic surfaces are suitable companion linear forms for cylindrical approximation and parameter estimation. The estimates provided the initial parametric approximations necessary for a nonlinear regression stage to fine-tune the estimates by fitting the actual nonlinear form to the data. A hypothesis-based split-merge algorithm was developed for the extraction and pose determination of cylinders and planes which merge smoothly into other surfaces. It was shown that all split-merge algorithms are hypothesis-based. A finite-state algorithm for the extraction of the boundaries of run-length regions was developed. The computation takes advantage of the run-list topology and the boundary direction constraints implicit in the run-length encoding.
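The Gaussian and mean curvature sign maps described above follow from the standard Monge-patch formulas once a polynomial z = f(x, y) has been fitted; a minimal sketch (in a real pipeline the partial derivatives would come from the fitted polynomial coefficients):

```python
import numpy as np

def curvature_signs(fx, fy, fxx, fxy, fyy):
    """Signs of Gaussian (K) and mean (H) curvature of a Monge patch
    z = f(x, y), from the standard closed-form expressions."""
    w = 1.0 + fx**2 + fy**2
    K = (fxx * fyy - fxy**2) / w**2
    H = ((1 + fy**2) * fxx - 2 * fx * fy * fxy + (1 + fx**2) * fyy) \
        / (2.0 * w**1.5)
    return np.sign(K), np.sign(H)

# Elliptic point: paraboloid z = x^2 + y^2 at the origin
kx, hx = curvature_signs(0.0, 0.0, 2.0, 0.0, 2.0)
# Hyperbolic point: saddle z = x^2 - y^2 at the origin
ks, hs = curvature_signs(0.0, 0.0, 2.0, 0.0, -2.0)
print("paraboloid (K, H) signs:", kx, hx, "saddle:", ks, hs)
```

The (sign K, sign H) pair classifies each surface point as elliptic, hyperbolic, parabolic, or planar, which is exactly the viewpoint-invariant labeling used for hypothesis generation.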
Improved Hybrid Modeling of Spent Fuel Storage Facilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bibber, Karl van
This work developed a new computational method for improving the ability to calculate the neutron flux in deep-penetration radiation shielding problems that contain areas with strong streaming. The “gold standard” method for radiation transport is Monte Carlo (MC), as it samples the physics exactly and requires few approximations. Historically, however, MC was not useful for shielding problems because of the computational challenge of following particles through dense shields. Instead, deterministic methods, which are superior in terms of computational effort for these problem types but are not as accurate, were used. Hybrid methods, which use deterministic solutions to improve MC calculations through a process called variance reduction, can make it tractable from a computational time and resource use perspective to use MC for deep-penetration shielding. Perhaps the most widespread and accessible of these methods are the Consistent Adjoint Driven Importance Sampling (CADIS) and Forward-Weighted CADIS (FW-CADIS) methods. For problems containing strong anisotropies, such as power plants with pipes through walls, spent fuel cask arrays, active interrogation, and locations with small air gaps or plates embedded in water or concrete, hybrid methods are still insufficiently accurate. In this work, a new method for generating variance reduction parameters for strongly anisotropic, deep-penetration radiation shielding studies was developed. This method generates an alternate form of the adjoint scalar flux quantity, Φ_Ω, which is used by both CADIS and FW-CADIS to generate variance reduction parameters for local and global response functions, respectively. The new method, called CADIS-Ω, was implemented in the Denovo/ADVANTG software. Results indicate that the flux generated by CADIS-Ω incorporates localized angular anisotropies in the flux more effectively than standard methods. CADIS-Ω outperformed CADIS in several test problems.
This initial work indicates that CADIS-Ω may be highly useful for shielding problems with strong angular anisotropies. This benefits the public by increasing accuracy at lower computational effort for many problems of energy, security, and economic importance.
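CADIS itself derives its weight windows from a deterministic adjoint solution; as a toy illustration of why biased sampling with statistical weights makes deep-penetration problems tractable at all (this is generic importance sampling, not the CADIS method):

```python
import numpy as np

# Toy deep-penetration problem: probability that a path length drawn from
# Exp(1) exceeds L = 10 mean free paths (analytic answer: exp(-10)).
# Analog Monte Carlo almost never scores at this depth; sampling from a
# stretched Exp(lambda) and carrying weights recovers the answer cheaply.
rng = np.random.default_rng(42)
L, n = 10.0, 20000

# Analog estimator: raw indicator, nearly always zero
analog = (rng.exponential(1.0, n) > L).mean()

# Biased estimator: sample Exp(lambda=0.1), weight = true pdf / biased pdf
lam = 0.1
x = rng.exponential(1.0 / lam, n)
w = np.exp(-x) / (lam * np.exp(-lam * x))
biased = np.where(x > L, w, 0.0).mean()

exact = np.exp(-L)
print(f"exact {exact:.3e}  analog {analog:.3e}  biased {biased:.3e}")
```

The hybrid methods in the abstract automate this idea in many dimensions, using the adjoint flux to decide how strongly to bias each region, energy, and (for CADIS-Ω) direction.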
Galaxy Evolution in the Radio Band: The Role of Star-forming Galaxies and Active Galactic Nuclei
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mancuso, C.; Prandoni, I.; Lapi, A.
We investigate the astrophysics of radio-emitting star-forming galaxies and active galactic nuclei (AGNs) and elucidate their statistical properties in the radio band, including luminosity functions, redshift distributions, and number counts at sub-mJy flux levels, which will be crucially probed by next-generation radio continuum surveys. Specifically, we exploit the model-independent approach by Mancuso et al. to compute the star formation rate functions, the AGN duty cycles, and the conditional probability of a star-forming galaxy to host an AGN with given bolometric luminosity. Coupling these ingredients with the radio emission properties associated with star formation and nuclear activity, we compute relevant statistics at different radio frequencies and disentangle the relative contribution of star-forming galaxies and AGNs in different radio luminosity, radio flux, and redshift ranges. Finally, we highlight that radio-emitting star-forming galaxies and AGNs are expected to host supermassive black holes accreting with different Eddington ratio distributions and to occupy different loci in the galaxy main-sequence diagrams. These specific predictions are consistent with current data sets but need to be tested with larger statistics via future radio data with multiband coverage on wide areas, as will become routinely achievable with the advent of the Square Kilometre Array and its precursors.
Arhin, Afua Ottie; Johnson-Mallard, Versie
2003-01-01
A majority of students in the classrooms of colleges and universities today are a product of a generation of latchkey kids, in which daycare, babysitters, television, and computers served as surrogate parents. With the proliferation of technology, the Internet, beepers and cell phones have become social lifelines for this generation. They are technology savvy, independent and resourceful. Conditioned to expect immediate gratification, these youth have shorter attention spans and a low threshold for boredom. It can be quite a challenge for educators to keep these young people engaged in the classroom. This paper presents an innovative teaching/learning strategy used in a nursing school in Florida that accommodates the unique characteristics of these learners.
Differential theory of learning for efficient neural network pattern recognition
NASA Astrophysics Data System (ADS)
Hampshire, John B., II; Vijaya Kumar, Bhagavatula
1993-09-01
We describe a new theory of differential learning by which a broad family of pattern classifiers (including many well-known neural network paradigms) can learn stochastic concepts efficiently. We describe the relationship between a classifier's ability to generalize well to unseen test examples and the efficiency of the strategy by which it learns. We list a series of proofs that differential learning is efficient in its information and computational resource requirements, whereas traditional probabilistic learning strategies are not. The proofs are illustrated by a simple example that lends itself to closed-form analysis. We conclude with an optical character recognition task for which three different types of differentially generated classifiers generalize significantly better than their probabilistically generated counterparts.
NASA Technical Reports Server (NTRS)
Murray, N. D.
1985-01-01
Current technology projections indicate a lack of availability of special-purpose computing for Space Station applications. Potential functions for video image special-purpose processing are being investigated, such as smoothing, enhancement, restoration and filtering, data compression, feature extraction, object detection and identification, pixel interpolation/extrapolation, spectral estimation and factorization, and vision synthesis. Also, architectural approaches are being identified and a conceptual design generated. Computationally simple algorithms will be researched and their image/vision effectiveness determined. Suitable algorithms will be implemented into an overall architectural approach that will provide image/vision processing at video rates that are flexible, selectable, and programmable. Information is given in the form of charts, diagrams and outlines.
NASA Technical Reports Server (NTRS)
Thompson, T. W.; Cutts, J. A.
1981-01-01
A catalog of lunar radar anomalies was generated to provide a base for comparison with Venusian radar signatures. The relationships between lunar radar anomalies and regolith processes were investigated, and a consortium was formed to compare lunar and Venusian radar images of craters. Time was scheduled at the Arecibo Observatory to use the 430 MHz radar to obtain high-resolution radar maps of six areas of the lunar surface. Data from 1978 observations of Mare Serenitatis and Plato are being analyzed on a PDP 11/70 computer to construct the computer program library necessary for the eventual reduction of the May 1981 and subsequent data acquisitions. Papers accepted for publication are presented.
Modeling of dialogue regimes of distance robot control
NASA Astrophysics Data System (ADS)
Larkin, E. V.; Privalov, A. N.
2017-02-01
The process of distance control of mobile robots is investigated. A Petri-Markov net for modeling of the dialogue regime is worked out. It is shown that the sequence of operations of the following subjects: a human operator, a dialogue computer and an onboard computer, may be simulated using the theory of semi-Markov processes. From the semi-Markov process of the general form, a Markov process was obtained which includes only the states of transaction generation. It is shown that a real transaction flow is the result of «concurrency» in the states of the Markov process. An iteration procedure for evaluation of transaction flow parameters, which takes into account the effect of «concurrency», is proposed.
Prosodic alignment in human-computer interaction
NASA Astrophysics Data System (ADS)
Suzuki, N.; Katagiri, Y.
2007-06-01
Androids that replicate humans in form also need to replicate them in behaviour to achieve a high level of believability or lifelikeness. We explore the minimal social cues that can induce in people the human tendency for social acceptance, or ethopoeia, toward artifacts, including androids. It has been observed that people exhibit a strong tendency to adjust to each other, through a number of speech and language features in human-human conversational interactions, to obtain communication efficiency and emotional engagement. We investigate in this paper the phenomena related to prosodic alignment in human-computer interactions, with particular focus on human-computer alignment of speech characteristics. We found that people exhibit unidirectional and spontaneous short-term alignment of loudness and response latency in their speech in response to computer-generated speech. We believe this phenomenon of prosodic alignment provides one of the key components for building social acceptance of androids.
Recent trends in digital human modeling and the concurrent issues that face human modeling approach
NASA Technical Reports Server (NTRS)
Rajulu, Sudhakar; Gonzalez, L. Javier; Margerum, Sarah; Clowers, Kurt; Moreny, Richard; Abercomby, Andrew; Velasquez, Luis
2006-01-01
Tremendous strides have been made in recent years to digitally represent human beings in computer simulation models, ranging from assembly plant maintenance operations to occupants getting in and out of vehicles to action movie scenarios. While some of these tools are being actively pursued by the engineering communities, there is still a lot of work that remains to be done for the newly planned planetary exploration missions. For example, certain unique and several common challenges are seen in developing computer-generated suited human models for designing the next-generation space vehicle. The purpose of this presentation is to discuss NASA's potential needs for better human models and also to show many of the inherent yet not too obvious pitfalls that remain unresolved in this new arena of digital human modeling. As part of NASA's Habitability and Human Factors Branch, the Anthropometry and Biomechanics Facility has been engaged in studying the various facets of computer-generated human physical performance models; for instance, it has utilized three-dimensional laser scan data along with three-dimensional video-based motion and reach data to gather suited anthropometric, shape, and size information that is not yet available in the form of computer mannequins. Our goal is to bring in new approaches to deal with heavily clothed humans (such as suited astronauts) and to overcome the current limitation of wrongly characterizing humans (either real or virtual) as univariate percentiles. We are looking at whole-body posture-based anthropometric models as a means to identify humans of significantly different shapes and sizes, to arrive at mathematically sound computer models for analytical purposes.
Propulsion/flight control integration technology (PROFIT) software system definition
NASA Technical Reports Server (NTRS)
Carlin, C. M.; Hastings, W. J.
1978-01-01
The Propulsion Flight Control Integration Technology (PROFIT) program is designed to develop a flying testbed dedicated to controls research. The control software for PROFIT is defined. Maximum flexibility, needed for long-term use of the flight facility, is achieved through a modular design. The Host program processes inputs from the telemetry uplink, aircraft central computer, cockpit computer control and plant sensors to form an input data base for use by the control algorithms. The control algorithms, programmed as application modules, process the input data to generate an output data base. The Host program formats the data for output to the telemetry downlink, the cockpit computer control, and the control effectors. Two application modules are defined: the bill-of-materials F-100 engine control and the bill-of-materials F-15 inlet control.
Excoffier, Laurent; Lischer, Heidi E L
2010-05-01
We present here a new version of the Arlequin program available under three different forms: a Windows graphical version (Winarl35), a console version of Arlequin (arlecore), and a specific console version to compute summary statistics (arlsumstat). The command-line versions run under both Linux and Windows. The main innovations of the new version include enhanced outputs in XML format, the possibility to embed graphics displaying computation results directly into output files, and the implementation of a new method to detect loci under selection from genome scans. Command-line versions are designed to handle large series of files, and arlsumstat can be used to generate summary statistics from simulated data sets within an Approximate Bayesian Computation framework. © 2010 Blackwell Publishing Ltd.
Structural optimization with approximate sensitivities
NASA Technical Reports Server (NTRS)
Patnaik, S. N.; Hopkins, D. A.; Coroneos, R.
1994-01-01
Computational efficiency in structural optimization can be enhanced if the intensive computations associated with the calculation of the sensitivities, that is, gradients of the behavior constraints, are reduced. An approximation to the gradients of the behavior constraints that can be generated with a small amount of numerical calculation is proposed. Structural optimization with these approximate sensitivities produced the correct optimum solution. Approximate gradients performed well for different nonlinear programming methods, such as the sequence of unconstrained minimization technique, the method of feasible directions, sequential quadratic programming, and sequential linear programming. Structural optimization with approximate gradients can reduce by one-third the CPU time that would otherwise be required to solve the problem with explicit closed-form gradients. The proposed gradient approximation shows potential to reduce the intensive computation that has traditionally been associated with structural optimization.
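The idea of cheap constraint-gradient approximations can be illustrated with a generic forward-difference sketch in Python; the paper's specific approximation scheme is not reproduced in the abstract, so the function `approx_gradient` and its step size are illustrative assumptions:

```python
import numpy as np

def approx_gradient(g, x, h=1e-6):
    """Forward-difference approximation to the gradient of a behavior
    constraint g at design point x (generic sketch, not the paper's
    specific approximation scheme)."""
    g0 = g(x)                          # one baseline constraint evaluation
    grad = np.empty_like(x, dtype=float)
    for i in range(x.size):
        xp = x.copy()
        xp[i] += h                     # perturb one design variable
        grad[i] = (g(xp) - g0) / h     # slope along that variable
    return grad
```

One baseline evaluation plus one perturbed evaluation per design variable replaces the explicit closed-form gradient, which is the trade-off the abstract quantifies in CPU time.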
Flat holographic stereograms synthesized from computer-generated images by using LiNbO3 crystal
NASA Astrophysics Data System (ADS)
Qu, Zhi-Min; Liu, Jinsheng; Xu, Liangying
1991-02-01
In this paper we describe a novel method for synthesizing computer-generated images in which, by means of a series of intermediate holograms recorded on Fe-doped LiNbO3 crystals, a high-quality flat stereogram with a wide viewing angle and considerable 3-D image depth has been obtained. 2. INTRODUCTION As we all know, conventional holography is very limited. With the help of a continuous-wave laser, only stationary objects can be recorded owing to its insufficient power. Although some moving objects can be recorded with a pulsed laser, the dimensions and kinds of objects are restricted. If we would like to see an imaginary object or a three-dimensional image designed by computer, it is very difficult by means of the above conventional holography. Of course, if we have a two-dimensional image on a computer screen we can rotate it to give a three-dimensional perspective, but we can never really see it as a solid. However, flat holographic stereograms synthesized from computer-generated images allow one to see the computed results directly in the form of a 3-D image. Obviously this will have wide applications in design, architecture, medicine, education and the arts. SPIE Vol. 1238, Three-Dimensional Holography: Science, Culture, Education (1989)
A Heterogeneous Multiprocessor Graphics System Using Processor-Enhanced Memories
1989-02-01
...frames per second, font generation directly from conic spline descriptions, and rapid calculation of radiosity form factors. The hardware consists of... generality for rendering curved surfaces, volume data, objects described with Constructive Solid Geometry, for rendering scenes using the radiosity... surfaces and for computing a spherical radiosity lighting model (see Section 7.6). (Figure residue: custom memory chips, 208 bits x 128 pixels; renderer board.)
Probabilistic Methods for Image Generation and Encoding.
1993-10-15
video and graphics lab at Georgia Tech, linking together Silicon Graphics workstations, a laser video recorder, a Betacam video recorder, scanner...computer laboratory at Georgia Tech, based on two Silicon Graphics Personal Iris workstations, a SONY laser video recorder, a SONY Betacam SP video...laser disk in component RGB form, with variable speed playback. From the laser recorder the images can be dubbed to the Betacam or the VHS recorder in
DOE Office of Scientific and Technical Information (OSTI.GOV)
March-Leuba, S.; Jansen, J.F.; Kress, R.L.
A new program package, Symbolic Manipulator Laboratory (SML), for the automatic generation of both kinematic and static manipulator models in symbolic form is presented. Critical design parameters may be identified and optimized using symbolic models, as shown in the sample application presented for the Future Armor Rearm System (FARS) arm. The computer-aided development of the symbolic models yields equations with reduced numerical complexity. Important considerations have been placed on the simplification of the closed-form solutions and on user-friendly operation. The main emphasis of this research is the development of a methodology, implemented in a computer program capable of generating symbolic kinematic and static force models of manipulators. The fact that the models are obtained in trigonometrically reduced form is among the most significant results of this work and the most difficult to implement. Mathematica, a commercial program that allows symbolic manipulation, is used to implement the program package. SML is written such that the user can change any of the subroutines or create new ones easily. To assist the user, on-line help has been written to make SML a user-friendly package. Some sample applications are presented. The design and optimization of the 5-degrees-of-freedom (DOF) FARS manipulator using SML is discussed. Finally, the kinematic and static models of two different 7-DOF manipulators are calculated symbolically.
The First Stars in the Universe and Cosmic Reionization
NASA Astrophysics Data System (ADS)
Barkana, Rennan
2006-08-01
The earliest generation of stars, far from being a mere novelty, transformed the universe from darkness to light. The first atoms to form after the Big Bang filled the universe with atomic hydrogen and a few light elements. As gravity pulled gas clouds together, the first stars ignited and their radiation turned the surrounding atoms into ions. By looking at gas between us and distant galaxies, we know that this ionization eventually pervaded all space, so that few hydrogen atoms remain today between galaxies. Knowing exactly when and how it did so is a primary goal of cosmologists, because this would tell us when the early stars formed and in what kinds of galaxies. Although this ionization is beginning to be understood by using theoretical models and computer simulations, a new generation of telescopes is being built that will map atomic hydrogen throughout the universe.
Dual-range linearized transimpedance amplifier system
Wessendorf, Kurt O.
2010-11-02
A transimpedance amplifier system is disclosed which simultaneously generates a low-gain output signal and a high-gain output signal from an input current signal using a single transimpedance amplifier having two different feedback loops with different amplification factors to generate two different output voltage signals. One of the feedback loops includes a resistor, and the other feedback loop includes another resistor in series with one or more diodes. The transimpedance amplifier system includes a signal linearizer to linearize one or both of the low- and high-gain output signals by scaling and adding the two output voltage signals from the transimpedance amplifier. The signal linearizer can be formed either as an analog device using one or two summing amplifiers, or alternately can be formed as a digital device using two analog-to-digital converters and a digital signal processor (e.g. a microprocessor or a computer).
Hyperspectral imaging for melanoma screening
NASA Astrophysics Data System (ADS)
Martin, Justin; Krueger, James; Gareau, Daniel
2014-03-01
The 5-year survival rate for patients diagnosed with melanoma, a deadly form of skin cancer, in its latest stages is about 15%, compared to over 90% with early detection and treatment. We present an imaging system and algorithm that can be used to automatically generate a melanoma risk score to aid clinicians in the early identification of this form of skin cancer. Our system images the patient's skin at a series of different wavelengths and then analyzes several key dermoscopic features to generate this risk score. We have found that shorter wavelengths of light are sensitive to information in the superficial areas of the skin, while longer wavelengths can be used to gather information at greater depths. The accompanying diagnostic computer algorithm has demonstrated much higher sensitivity and specificity than the currently commercialized system in preliminary trials and has the potential to improve the early detection of melanoma.
Semantics-based distributed I/O with the ParaMEDIC framework.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Balaji, P.; Feng, W.; Lin, H.
2008-01-01
Many large-scale applications simultaneously rely on multiple resources for efficient execution. For example, such applications may require both large compute and storage resources; however, very few supercomputing centers can provide large quantities of both. Thus, data generated at the compute site oftentimes has to be moved to a remote storage site for either storage or visualization and analysis. Clearly, this is not an efficient model, especially when the two sites are distributed over a wide-area network. Thus, we present a framework called 'ParaMEDIC: Parallel Metadata Environment for Distributed I/O and Computing' which uses application-specific semantic information to convert the generated data to orders-of-magnitude smaller metadata at the compute site, transfer the metadata to the storage site, and re-process the metadata at the storage site to regenerate the output. Specifically, ParaMEDIC trades a small amount of additional computation (in the form of data post-processing) for a potentially significant reduction in data that needs to be transferred in distributed environments.
Dauz, Emily; Moore, Jan; Smith, Carol E; Puno, Florence; Schaag, Helen
2004-01-01
This article describes the experiences of nurses who, as part of a large clinical trial, brought the Internet into older adults' homes by installing a computer, if needed, and connecting to a patient education Web site. Most of these patients had not previously used the Internet and were taught even basic computer skills when necessary. Because of increasing use of the Internet in patient education, assessment, and home monitoring, nurses in various roles currently connect with patients to monitor their progress, teach about medications, and answer questions about appointments and treatments. Thus, nurses find themselves playing the role of technology managers for patients with home-based Internet connections. This article provides step-by-step procedures for computer installation and training in the form of protocols, checklists, and patient user guides. By following these procedures, nurses can install computers, arrange Internet access, teach and connect to their patients, and prepare themselves to install future generations of technological devices.
Laser based micro forming and assembly.
DOE Office of Scientific and Technical Information (OSTI.GOV)
MacCallum, Danny O'Neill; Wong, Chung-Nin Channy; Knorovsky, Gerald Albert
2006-11-01
It has been shown that thermal energy imparted to a metallic substrate by laser heating induces a transient temperature gradient through the thickness of the sample. In favorable conditions of laser fluence and absorptivity, the resulting inhomogeneous thermal strain leads to a measurable permanent deflection. This project established parameters for laser micro forming of thin materials that are relevant to MESA generation weapon system components and confirmed methods for producing micrometer displacements with repeatable bend direction and magnitude. Precise micro forming vectors were realized through computational finite element analysis (FEA) of laser-induced transient heating that indicated the optimal combination of laser heat input relative to the material being heated and its thermal mass. Precise laser micro forming was demonstrated in two practical manufacturing operations of importance to the DOE complex: micrometer gap adjustments of precious metal alloy contacts and forming of meso scale cones.
Knowledge and utilization of computer-software for statistics among Nigerian dentists.
Chukwuneke, F N; Anyanechi, C E; Obiakor, A O; Amobi, O; Onyejiaka, N; Alamba, I
2013-01-01
The use of computer software for the generation of statistical analyses has transformed health information and data into their simplest form in the areas of access, storage, retrieval and analysis in the field of research. This survey therefore was carried out to assess the level of knowledge and utilization of computer software for statistical analysis among dental researchers in eastern Nigeria. Questionnaires on the use of computer software for statistical analysis were randomly distributed to 65 practicing dental surgeons with above 5 years' experience in the tertiary academic hospitals in eastern Nigeria. The focus was on: years of clinical experience; research work experience; and knowledge and application of computer software for data processing and statistical analysis. Sixty-two (62/65; 95.4%) of these questionnaires were returned anonymously and used in our data analysis. Twenty-nine (29/62; 46.8%) respondents fell within those with 5-10 years of clinical experience, none of whom had completed the specialist training programme. Practitioners with above 10 years' clinical experience numbered 33 (33/62; 53.2%), of whom 15 (15/33; 45.5%) are specialists, representing 24.2% (15/62) of the total number of respondents. All 15 specialists are actively involved in research activities, and only five (5/15; 33.3%) can use statistical analysis software unaided. This study has identified poor utilization of computer software for statistical analysis among dental researchers in eastern Nigeria. This is strongly associated with a lack of early exposure to such software, especially during undergraduate training. This calls for the introduction of a computer training programme into the dental curriculum to enable practitioners to develop the habit of using computer software for their research.
Kiuchi, T; Kaihara, S
1997-02-01
The World Wide Web-based form is a promising method for the construction of an on-line data collection system for clinical and epidemiological research. It is, however, laborious to prepare a common gateway interface (CGI) program for each project, which the World Wide Web server needs to handle the submitted data. In medicine, it is even more laborious because the CGI program must check for deficits, types, ranges, and logical errors (bad combinations of data) in the entered data for quality assurance, as well as the data length and meta-characters of the entered data to enhance the security of the server. We have extended the specification of the hypertext markup language (HTML) form to accommodate information necessary for such data checking, and we have developed software named AUTOFORM for this purpose. The software automatically analyzes the extended HTML form and generates the corresponding ordinary HTML form, 'Makefile', and C source of CGI programs. The resultant CGI program checks the data entered through the HTML form, records them in a computer, and returns them to the end-user. AUTOFORM drastically reduces the burden of developing a World Wide Web-based data entry system and allows CGI programs to be prepared more securely and reliably than had they been written from scratch.
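The declarative data checking that such a generator emits can be sketched generically; the rule schema `RULES` and the `check` helper below are hypothetical illustrations in Python, not AUTOFORM's actual extended-HTML syntax or its generated C code:

```python
# Hypothetical per-field rules in the spirit of an extended HTML form:
# required flag, expected type, and a permitted range.
RULES = {"age": {"type": int, "min": 0, "max": 120, "required": True}}

def check(data, rules):
    """Validate submitted form data against the rules; return a list of
    error messages (empty list means the submission is clean)."""
    errors = []
    for name, r in rules.items():
        val = data.get(name)
        if val is None or val == "":
            if r.get("required"):
                errors.append(f"{name}: missing")
            continue
        try:
            val = r["type"](val)           # type check / conversion
        except ValueError:
            errors.append(f"{name}: bad type")
            continue
        if "min" in r and val < r["min"]:  # range checks
            errors.append(f"{name}: below minimum")
        if "max" in r and val > r["max"]:
            errors.append(f"{name}: above maximum")
    return errors
```

Generating both the HTML form and this kind of server-side validator from one annotated source is what removes the duplicated, error-prone hand coding the abstract describes.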
An Excel macro for generating trilinear plots.
Shikaze, Steven G; Crowe, Allan S
2007-01-01
This computer note describes a method for creating trilinear plots in Microsoft Excel. Macros have been created in MS Excel's internal language: Visual Basic for Applications (VBA). A simple form has been set up to allow the user to input data from an Excel worksheet. The VBA macro is used to convert the triangular data (which consist of three columns of percentage data) into X-Y data. The macro then generates the axes, labels, and grid for the trilinear plot. The X-Y data are plotted as scatter data in Excel. By providing this macro in Excel, users can create trilinear plots in a quick, inexpensive manner.
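The triangular-to-Cartesian conversion at the heart of such a macro can be sketched as follows; the note does not give the macro's exact formulas, so this Python version (the original is VBA) assumes the standard equilateral-triangle mapping:

```python
import math

def ternary_to_xy(a, b, c):
    """Convert one ternary (percentage) triple to Cartesian X-Y on an
    equilateral triangle with vertices a=(0,0), b=(1,0), c=(0.5, sqrt(3)/2).
    Assumed standard mapping, not necessarily the macro's own convention."""
    total = a + b + c               # normalize so the components sum to 1
    a, b, c = a / total, b / total, c / total
    x = 0.5 * (2 * b + c)           # horizontal position
    y = (math.sqrt(3) / 2) * c      # vertical position (apex = 100% c)
    return x, y
```

Each row of three percentage columns maps to one X-Y point, which is then plotted as ordinary scatter data.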
Unconstrained paving and plastering method for generating finite element meshes
Staten, Matthew L.; Owen, Steven J.; Blacker, Teddy D.; Kerr, Robert
2010-03-02
Computer software for and a method of generating a conformal all quadrilateral or hexahedral mesh comprising selecting an object with unmeshed boundaries and performing the following while unmeshed voids are larger than twice a desired element size and unrecognizable as either a midpoint subdividable or pave-and-sweepable polyhedra: selecting a front to advance; based on sizes of fronts and angles with adjacent fronts, determining which adjacent fronts should be advanced with the selected front; advancing the fronts; detecting proximities with other nearby fronts; resolving any found proximities; forming quadrilaterals or unconstrained columns of hexahedra where two layers cross; and establishing hexahedral elements where three layers cross.
Extended Wordsearches in Chemistry
NASA Astrophysics Data System (ADS)
Cotton, Simon
1998-04-01
Students can be encouraged to develop their factual knowledge by use of puzzles. One strategy described here is the extended wordsearch, where the wordsearch element generates a number of words or phrases from which the answers to a series of questions are selected. The wordsearch can be generated with the aid of computer programs, though in order to make them suitable for students with dyslexia or other learning difficulties, a simpler form is more appropriate. These problems can be employed in a variety of contexts, for example, as topic tests and classroom end-of-lesson fillers. An example is provided in the area of calcium chemistry. Sources of suitable software are listed.
Gröbner Bases and Generation of Difference Schemes for Partial Differential Equations
NASA Astrophysics Data System (ADS)
Gerdt, Vladimir P.; Blinkov, Yuri A.; Mozzhilkin, Vladimir V.
2006-05-01
In this paper we present an algorithmic approach to the generation of fully conservative difference schemes for linear partial differential equations. The approach is based on enlargement of the equations in their integral conservation law form by extra integral relations between unknown functions and their derivatives, and on discretization of the obtained system. The structure of the discrete system depends on numerical approximation methods for the integrals occurring in the enlarged system. As a result of the discretization, a system of linear polynomial difference equations is derived for the unknown functions and their partial derivatives. A difference scheme is constructed by elimination of all the partial derivatives. The elimination can be achieved by selecting a proper elimination ranking and by computing a Gröbner basis of the linear difference ideal generated by the polynomials in the discrete system. For these purposes we use the difference form of Janet-like Gröbner bases and their implementation in Maple. As illustration of the described methods and algorithms, we construct a number of difference schemes for Burgers and Falkowich-Karman equations and discuss their numerical properties.
A Unified Dynamic Model for Learning, Replay, and Sharp-Wave/Ripples.
Jahnke, Sven; Timme, Marc; Memmesheimer, Raoul-Martin
2015-12-09
Hippocampal activity is fundamental for episodic memory formation and consolidation. During phases of rest and sleep, it exhibits sharp-wave/ripple (SPW/R) complexes, which are short episodes of increased activity with superimposed high-frequency oscillations. Simultaneously, spike sequences reflecting previous behavior, such as traversed trajectories in space, are replayed. Whereas these phenomena are thought to be crucial for the formation and consolidation of episodic memory, their neurophysiological mechanisms are not well understood. Here we present a unified model showing how experience may be stored and thereafter replayed in association with SPW/Rs. We propose that replay and SPW/Rs are tightly interconnected, as they mutually generate and support each other. The underlying mechanism is based on the nonlinear dendritic computation attributable to dendritic sodium spikes that have been prominently found in the hippocampal regions CA1 and CA3, where SPW/Rs and replay are also generated. Besides assigning SPW/Rs a crucial role for replay and thus memory processing, the proposed mechanism also explains their characteristic features, such as the oscillation frequency and the overall wave form. The results shed new light on the dynamical aspects of hippocampal circuit learning. During phases of rest and sleep, the hippocampus, the "memory center" of the brain, generates intermittent patterns of strongly increased overall activity with high-frequency oscillations, the so-called sharp-wave/ripples. We investigate their role in learning and memory processing. They occur together with replay of activity sequences reflecting previous behavior. Developing a unifying computational model, we propose that both phenomena are tightly linked, mutually generating and supporting each other. The underlying mechanism depends on nonlinear amplification of synchronous inputs that has been prominently found in the hippocampus.
Besides assigning sharp-wave/ripples a crucial role for replay generation and thus memory processing, the proposed mechanism also explains their characteristic features, such as the oscillation frequency and the overall wave form. Copyright © 2015 the authors.
Comparison of two Galerkin quadrature methods
Morel, Jim E.; Warsa, James; Franke, Brian C.; ...
2017-02-21
Here, we compare two methods for generating Galerkin quadratures. In method 1, the standard S N method is used to generate the moment-to-discrete matrix and the discrete-to-moment matrix is generated by inverting the moment-to-discrete matrix. This is a particular form of the original Galerkin quadrature method. In method 2, which we introduce here, the standard S N method is used to generate the discrete-to-moment matrix and the moment-to-discrete matrix is generated by inverting the discrete-to-moment matrix. With an N-point quadrature, method 1 has the advantage that it preserves N eigenvalues and N eigenvectors of the scattering operator in a pointwise sense. With an N-point quadrature, method 2 has the advantage that it generates consistent angular moment equations from the corresponding S N equations while preserving N eigenvalues of the scattering operator. Our computational results indicate that these two methods are quite comparable for the test problem considered.
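Under textbook S N conventions for a 1-D slab with Gauss-Legendre ordinates (assumed here, since the abstract does not give the normalization), the two constructions can be sketched as follows:

```python
import numpy as np
from numpy.polynomial.legendre import leggauss, Legendre

def galerkin_pair(n, method=1):
    """Build the (discrete-to-moment, moment-to-discrete) matrix pair for
    a 1-D Gauss-Legendre S_N set by each of the two methods.  The matrix
    normalizations are textbook conventions, assumed rather than taken
    from the paper."""
    mu, w = leggauss(n)                          # ordinates and weights
    # P[l, m] = P_l(mu_m): Legendre polynomials evaluated at the ordinates
    P = np.vstack([Legendre.basis(l)(mu) for l in range(n)])
    D_std = P * w                                # phi_l = sum_m w_m P_l(mu_m) psi_m
    M_std = P.T * ((2 * np.arange(n) + 1) / 2)   # psi_m = sum_l (2l+1)/2 P_l(mu_m) phi_l
    if method == 1:                              # method 1: invert moment-to-discrete
        return np.linalg.inv(M_std), M_std
    return D_std, np.linalg.inv(D_std)           # method 2: invert discrete-to-moment
```

For Gauss-Legendre ordinates the two methods coincide, because the standard discrete-to-moment matrix is already the exact inverse of the standard moment-to-discrete matrix (the quadrature integrates all the required polynomial products exactly); the methods differ for quadrature sets without that exactness.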
Optimal Interpolation scheme to generate reference crop evapotranspiration
NASA Astrophysics Data System (ADS)
Tomas-Burguera, Miquel; Beguería, Santiago; Vicente-Serrano, Sergio; Maneta, Marco
2018-05-01
We used an Optimal Interpolation (OI) scheme to generate a reference crop evapotranspiration (ETo) grid, forcing meteorological variables, and their respective error variance in the Iberian Peninsula for the period 1989-2011. To perform the OI we used observational data from the Spanish Meteorological Agency (AEMET) and outputs from a physically-based climate model. To compute ETo we used five OI schemes to generate grids for the five observed climate variables necessary to compute ETo using the FAO-recommended form of the Penman-Monteith equation (FAO-PM). The granularity of the resulting grids is less sensitive to variations in the density and distribution of the observational network than that of grids generated by other interpolation methods. This is because our implementation of the OI method uses a physically-based climate model as prior background information about the spatial distribution of the climatic variables, which is critical for under-observed regions. This provides temporal consistency in the spatial variability of the climatic fields. We also show that increases in the density and improvements in the distribution of the observational network substantially reduce the uncertainty of the climatic and ETo estimates. Finally, a sensitivity analysis of observational uncertainties and network densification suggests the existence of a trade-off between quantity and quality of observations.
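The OI analysis step behind such a scheme can be sketched for a toy two-point grid with a single observation (all numbers below are hypothetical; the real scheme works on full fields with modeled background covariances): the analysis is x_a = x_b + K(y - Hx_b) with gain K = BH^T (HBH^T + R)^(-1).

```python
def oi_analysis(xb, B, y, obs_idx, R):
    """One optimal-interpolation update: background xb, background covariance B,
    a single observation y at grid index obs_idx with error variance R."""
    n = len(xb)
    innov = y - xb[obs_idx]                 # innovation y - H xb (H selects one point)
    denom = B[obs_idx][obs_idx] + R         # H B H^T + R is scalar here
    K = [B[i][obs_idx] / denom for i in range(n)]
    xa = [xb[i] + K[i] * innov for i in range(n)]
    # analysis-error covariance A = (I - K H) B
    A = [[B[i][j] - K[i] * B[obs_idx][j] for j in range(n)] for i in range(n)]
    return xa, A
```

The reduction of the analysis-error variance at and around the observed point is exactly the uncertainty-shrinking effect of network densification that the abstract describes.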
Kolodny, Oren; Lotem, Arnon; Edelman, Shimon
2015-03-01
We introduce a set of biologically and computationally motivated design choices for modeling the learning of language, or of other types of sequential, hierarchically structured experience and behavior, and describe an implemented system that conforms to these choices and is capable of unsupervised learning from raw natural-language corpora. Given a stream of linguistic input, our model incrementally learns a grammar that captures its statistical patterns, which can then be used to parse or generate new data. The grammar constructed in this manner takes the form of a directed weighted graph, whose nodes are recursively (hierarchically) defined patterns over the elements of the input stream. We evaluated the model in seventeen experiments, grouped into five studies, which examined, respectively, (a) the generative ability of grammar learned from a corpus of natural language, (b) the characteristics of the learned representation, (c) sequence segmentation and chunking, (d) artificial grammar learning, and (e) certain types of structure dependence. The model's performance largely vindicates our design choices, suggesting that progress in modeling language acquisition can be made on a broad front, ranging from issues of generativity to the replication of human experimental findings, by bringing biological and computational considerations, as well as lessons from prior efforts, to bear on the modeling approach. Copyright © 2014 Cognitive Science Society, Inc.
Approximate Solution to the Angular Speeds of a Nearly-Symmetric Mass-Varying Cylindrical Body
NASA Astrophysics Data System (ADS)
Nanjangud, Angadh; Eke, Fidelis
2017-06-01
This paper examines the rotational motion of a nearly axisymmetric rocket type system with uniform burn of its propellant. The asymmetry comes from a slight difference in the transverse principal moments of inertia of the system, which then results in a set of nonlinear equations of motion even when no external torque is applied to the system. It is often difficult, or even impossible, to generate analytic solutions for such equations; closed form solutions are even more difficult to obtain. In this paper, a perturbation-based approach is employed to linearize the equations of motion and generate analytic solutions. The solutions for the variables of transverse motion are analytic and a closed-form solution to the spin rate is suggested. The solutions are presented in a compact form that permits rapid computation. The approximate solutions are then applied to the torque-free motion of a typical solid rocket system and the results are found to agree with those obtained from the numerical solution of the full non-linear equations of motion of the mass varying system.
NASA Astrophysics Data System (ADS)
Bansal, Shonak; Singh, Arun Kumar; Gupta, Neena
2017-02-01
In real life, multi-objective engineering design problems are difficult and time-consuming optimization problems because of their high degree of nonlinearity, complexity, and inhomogeneity. Nature-inspired multi-objective optimization algorithms are becoming popular for solving such problems. This paper proposes an original multi-objective Bat algorithm (MOBA) and its extended form, a novel parallel hybrid multi-objective Bat algorithm (PHMOBA), to generate shortest-length Golomb rulers, called optimal Golomb ruler (OGR) sequences, at reasonable computation time. OGRs find application in optical wavelength division multiplexing (WDM) systems as a channel-allocation algorithm to reduce four-wave mixing (FWM) crosstalk. The performance of both proposed algorithms in generating OGRs for optical WDM channel allocation is compared with that of existing classical computing and nature-inspired algorithms, including extended quadratic congruence (EQC), search algorithm (SA), genetic algorithms (GAs), biogeography based optimization (BBO), and big bang-big crunch (BB-BC) optimization. Simulations indicate that the proposed parallel hybrid multi-objective Bat algorithm generates OGRs more efficiently than the original multi-objective Bat algorithm and the other existing algorithms. PHMOBA also has a higher convergence and success rate than the original MOBA. In terms of ruler length and total optical channel bandwidth (TBW), the efficiency improvement of the proposed PHMOBA for OGRs up to 20 marks is 100 %, versus 85 % for the original MOBA. Finally, implications for further research are discussed.
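A Golomb ruler is a set of integer marks whose pairwise differences are all distinct, and the OGR is the shortest such ruler for a given mark count. A brute-force baseline (nothing like the Bat algorithm, but useful for verifying small rulers) can be sketched as:

```python
from itertools import combinations

def is_golomb(marks):
    # a ruler is Golomb iff all pairwise mark differences are distinct
    dists = [b - a for a, b in combinations(marks, 2)]
    return len(dists) == len(set(dists))

def optimal_golomb(n):
    """Exhaustively find a shortest n-mark Golomb ruler (first one in scan order)."""
    length = n - 1
    while True:
        for inner in combinations(range(1, length), n - 2):
            marks = (0,) + inner + (length,)
            if is_golomb(marks):
                return marks
        length += 1
```

Exhaustive search is only feasible for very small mark counts, since the search space grows combinatorially, which is why the paper turns to nature-inspired metaheuristics for rulers up to 20 marks.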
Fourier transform magnitudes are unique pattern recognition templates.
Gardenier, P H; McCallum, B C; Bates, R H
1986-01-01
Fourier transform magnitudes are commonly used in the generation of templates in pattern recognition applications. We report on recent advances in Fourier phase retrieval which are relevant to pattern recognition. We emphasise in particular that the intrinsic form of a finite, positive image is, in general, uniquely related to the magnitude of its Fourier transform. We state conditions under which the Fourier phase can be reconstructed from samples of the Fourier magnitude, and describe a method of achieving this. Computational examples of restoration of Fourier phase (and hence, by Fourier transformation, the intrinsic form of the image) from samples of the Fourier magnitude are also presented.
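The reconstruction of phase from magnitude samples can be illustrated with a basic error-reduction loop (in the spirit of Fienup's iterative algorithm, not the authors' specific method): alternate between imposing the measured Fourier magnitude and imposing positivity and finite support in the object domain.

```python
import cmath
import random

def dft(x):
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

def fourier_error(x, mag):
    # squared residual between the iterate's Fourier magnitude and the data
    return sum((abs(Xk) - m) ** 2 for Xk, m in zip(dft(x), mag))

def error_reduction(mag, support, iters=200, seed=1):
    rng = random.Random(seed)
    x = [rng.random() if s else 0.0 for s in support]   # random positive start
    for _ in range(iters):
        X = dft(x)
        # Fourier-domain projection: keep the phase, impose the measured magnitude
        X = [m * Xk / abs(Xk) if abs(Xk) > 1e-12 else complex(m)
             for Xk, m in zip(X, mag)]
        x = idft(X)
        # object-domain projection: enforce positivity and the known support
        x = [max(v.real, 0.0) if s else 0.0 for v, s in zip(x, support)]
    return x
```

One-dimensional magnitude-only data carries well-known ambiguities (shifts and reflections), so the recovered signal may be an equivalent form of the original; the abstract's uniqueness claim concerns the intrinsic form of the image.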
NASA Technical Reports Server (NTRS)
Cole, H. A., Jr.
1973-01-01
Random decrement signatures of structures vibrating in a random environment are studied through use of computer-generated and experimental data. Statistical properties obtained indicate that these signatures are stable in form and scale and hence should have wide application in on-line failure detection and damping measurement. On-line procedures are described, and equations for estimating record-length requirements to obtain signatures of a prescribed precision are given.
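The random decrement idea, averaging the response segments that follow each threshold crossing so that the random excitation cancels while the free-decay signature remains, can be sketched on synthetic data (the lightly damped AR(2) "structure" below is a made-up stand-in, not from the report):

```python
import math
import random
import statistics

def randomdec(x, threshold, seg_len):
    """Average all segments that begin at an upward crossing of the threshold."""
    segs = [x[i:i + seg_len] for i in range(1, len(x) - seg_len)
            if x[i - 1] < threshold <= x[i]]
    sig = [sum(s[j] for s in segs) / len(segs) for j in range(seg_len)]
    return sig, len(segs)

# synthetic lightly damped oscillator driven by white noise (AR(2) model)
rng = random.Random(42)
r, w = 0.98, 0.3
x = [0.0, 0.0]
for _ in range(20000):
    x.append(2 * r * math.cos(w) * x[-1] - r * r * x[-2] + rng.gauss(0.0, 1.0))

threshold = 1.5 * statistics.pstdev(x)
sig, count = randomdec(x, threshold, 120)
```

The signature starts at the trigger level by construction and decays like the structure's free response; its stability as more segments are averaged is what makes the precision-versus-record-length trade-off in the report quantifiable.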
Project Listen Compute Show (LCS) - Marine
2004-02-01
Figure 15. Block diagram of a BB-5. Notice the discrete components between the FPGA and the display connection. All of these are scheduled to be... scheduled to form the core of the next generation projection product. This architecture is expected to scale to true HDTV resolution of 1920 by 1080...flight schedule obtained from a SABRE database in order to offer on-time status. We have developed more sophisticated mechanisms for dealing with
Differential modal Zernike wavefront sensor employing a computer-generated hologram: a proposal.
Mishra, Sanjay K; Bhatt, Rahul; Mohan, Devendra; Gupta, Arun Kumar; Sharma, Anurag
2009-11-20
The process of Zernike mode detection with a Shack-Hartmann wavefront sensor is computationally extensive. A holographic modal wavefront sensor has therefore evolved to process the data optically by use of the concept of equal and opposite phase bias. Recently, a multiplexed computer-generated hologram (CGH) technique was developed in which the output is in the form of bright dots that specify the presence and strength of a specific Zernike mode. We propose a wavefront sensor using the concept of phase biasing in the latter technique such that the output is a pair of bright dots for each mode to be sensed. A normalized difference signal between the intensities of the two dots is proportional to the amplitude of the sensed Zernike mode. In our method the number of holograms to be multiplexed is decreased, thereby reducing the modal cross talk significantly. We validated the proposed method through simulation studies for several cases. The simulation results demonstrate simultaneous wavefront detection of lower-order Zernike modes with a resolution better than λ/50 for the wide measurement range of ±3.5λ with much reduced cross talk at high speed.
PROTEUS two-dimensional Navier-Stokes computer code, version 1.0. Volume 1: Analysis description
NASA Technical Reports Server (NTRS)
Towne, Charles E.; Schwab, John R.; Benson, Thomas J.; Suresh, Ambady
1990-01-01
A new computer code was developed to solve the two-dimensional or axisymmetric, Reynolds averaged, unsteady compressible Navier-Stokes equations in strong conservation law form. The thin-layer or Euler equations may also be solved. Turbulence is modeled using an algebraic eddy viscosity model. The objective was to develop a code for aerospace applications that is easy to use and easy to modify. Code readability, modularity, and documentation were emphasized. The equations are written in nonorthogonal body-fitted coordinates, and solved by marching in time using a fully-coupled alternating direction-implicit procedure with generalized first- or second-order time differencing. All terms are linearized using second-order Taylor series. The boundary conditions are treated implicitly, and may be steady, unsteady, or spatially periodic. Simple Cartesian or polar grids may be generated internally by the program. More complex geometries require an externally generated computational coordinate system. The documentation is divided into three volumes. Volume 1 is the Analysis Description, and describes in detail the governing equations, the turbulence model, the linearization of the equations and boundary conditions, the time and space differencing formulas, the ADI solution procedure, and the artificial viscosity models.
Efficient Optimization of Low-Thrust Spacecraft Trajectories
NASA Technical Reports Server (NTRS)
Lee, Seungwon; Fink, Wolfgang; Russell, Ryan; Terrile, Richard; Petropoulos, Anastassios; vonAllmen, Paul
2007-01-01
A paper describes a computationally efficient method of optimizing trajectories of spacecraft driven by propulsion systems that generate low thrusts and, hence, must be operated for long times. A common goal in trajectory-optimization problems is to find minimum-time, minimum-fuel, or Pareto-optimal trajectories (here, Pareto-optimality signifies that no other solutions are superior with respect to both flight time and fuel consumption). The present method utilizes genetic and simulated-annealing algorithms to search for globally Pareto-optimal solutions. These algorithms are implemented in parallel form to reduce computation time. These algorithms are coupled with either of two traditional trajectory-design approaches called "direct" and "indirect." In the direct approach, thrust control is discretized in either arc time or arc length, and the resulting discrete thrust vectors are optimized. The indirect approach involves the primer-vector theory (introduced in 1963), in which the thrust control problem is transformed into a co-state control problem and the initial values of the co-state vector are optimized. In application to two example orbit-transfer problems, this method was found to generate solutions comparable to those of other state-of-the-art trajectory-optimization methods while requiring much less computation time.
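The notion of Pareto-optimality used here, that no other solution is better in both flight time and fuel consumption, reduces to a simple non-dominated filter over candidate trajectories. A minimal sketch (illustrative only, not the paper's optimizer):

```python
def pareto_front(points):
    """Return the non-dominated subset of (flight_time, fuel_mass) candidates:
    a point is kept unless some other point is at least as good in both objectives."""
    front = [p for p in points
             if not any(q != p and q[0] <= p[0] and q[1] <= p[1] for q in points)]
    return sorted(front)
```

Genetic and simulated-annealing searches like those in the paper repeatedly generate candidate trajectories and retain exactly this kind of non-dominated set as the evolving Pareto front.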
Designing and using computer simulations in medical education and training: an introduction.
Friedl, Karl E; O'Neil, Harold F
2013-10-01
Computer-based technologies informed by the science of learning are becoming increasingly prevalent in education and training. For the Department of Defense (DoD), this presents a great potential advantage to the effective preparation of a new generation of technologically enabled service members. Military medicine has broad education and training challenges ranging from first aid and personal protective skills for every service member to specialized combat medic training; many of these challenges can be met with gaming and simulation technologies that this new generation has embraced. However, comprehensive use of medical games and simulation to augment expert mentorship is still limited to elite medical provider training programs, but can be expected to become broadly used in the training of first responders and allied health care providers. The purpose of this supplement is to review the use of computer games and simulation to teach and assess medical knowledge and skills. This review and other DoD research policy sources will form the basis for development of a research and development road map and guidelines for use of this technology in military medicine. Reprint & Copyright © 2013 Association of Military Surgeons of the U.S.
NASA Astrophysics Data System (ADS)
Gervás, Pablo
2016-04-01
Most poetry-generation systems apply opportunistic approaches where algorithmic procedures are applied to explore the conceptual space defined by a given knowledge resource in search of solutions that might be aesthetically valuable. Aesthetical value is assumed to arise from compliance to a given poetic form - such as rhyme or metrical regularity - or from evidence of semantic relations between the words in the resulting poems that can be interpreted as rhetorical tropes - such as similes, analogies, or metaphors. This approach tends to fix a priori the aesthetic parameters of the results, and imposes no constraints on the message to be conveyed. The present paper describes an attempt to initiate a shift in this balance, introducing means for constraining the output to certain topics and allowing a looser mechanism for constraining form. This goal arose as a result of the need to produce poems for a themed collection commissioned to be included in a book. The solution adopted explores an approach to creativity where the goals are not solely aesthetic and where the results may be surprising in their poetic form. An existing computer poet, originally developed to produce poems in a given form but with no specific constraints on their content, is put to the task of producing a set of poems with explicit restrictions on content, and allowing for an exploration of poetic form. Alternative generation methods are devised to overcome the difficulties, and the various insights arising from these new methods and the impact they have on the set of resulting poems are discussed in terms of their potential contribution to better poetry-generation systems.
Demonstration of Automatically-Generated Adjoint Code for Use in Aerodynamic Shape Optimization
NASA Technical Reports Server (NTRS)
Green, Lawrence; Carle, Alan; Fagan, Mike
1999-01-01
Gradient-based optimization requires accurate derivatives of the objective function and constraints. These gradients may have previously been obtained by manual differentiation of analysis codes, symbolic manipulators, finite-difference approximations, or existing automatic differentiation (AD) tools such as ADIFOR (Automatic Differentiation in FORTRAN). Each of these methods has certain deficiencies, particularly when applied to complex, coupled analyses with many design variables. Recently, a new AD tool called ADJIFOR (Automatic Adjoint Generation in FORTRAN), based upon ADIFOR, was developed and demonstrated. Whereas ADIFOR implements forward-mode (direct) differentiation throughout an analysis program to obtain exact derivatives via the chain rule of calculus, ADJIFOR implements the reverse-mode counterpart of the chain rule to obtain exact adjoint form derivatives from FORTRAN code. Automatically-generated adjoint versions of the widely-used CFL3D computational fluid dynamics (CFD) code and an algebraic wing grid generation code were obtained with just a few hours processing time using the ADJIFOR tool. The codes were verified for accuracy and were shown to compute the exact gradient of the wing lift-to-drag ratio, with respect to any number of shape parameters, in about the time required for 7 to 20 function evaluations. The codes have now been executed on various computers with typical memory and disk space for problems with up to 129 x 65 x 33 grid points, and for hundreds to thousands of independent variables. These adjoint codes are now used in a gradient-based aerodynamic shape optimization problem for a swept, tapered wing. For each design iteration, the optimization package constructs an approximate, linear optimization problem, based upon the current objective function, constraints, and gradient values. 
The optimizer subroutines are called within a design loop employing the approximate linear problem until an optimum shape is found, the design loop limit is reached, or no further design improvement is possible due to active design variable bounds and/or constraints. The resulting shape parameters are then used by the grid generation code to define a new wing surface and computational grid. The lift-to-drag ratio and its gradient are computed for the new design by the automatically-generated adjoint codes. Several optimization iterations may be required to find an optimum wing shape. Results from two sample cases will be discussed. The reader should note that this work primarily represents a demonstration of use of automatically-generated adjoint code within an aerodynamic shape optimization. As such, little significance is placed upon the actual optimization results, relative to the method for obtaining the results.
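The reverse-mode chain rule that ADJIFOR mechanizes for FORTRAN can be illustrated with a toy recursive implementation in Python (a sketch of the principle, not of ADJIFOR itself): each operation records its local partial derivatives, and a backward sweep accumulates exact gradients with respect to all inputs from a single evaluation.

```python
class Var:
    """Minimal reverse-mode AD: each Var records its parents and local derivatives."""
    def __init__(self, value, parents=()):
        self.value, self.parents, self.grad = value, parents, 0.0

    def __add__(self, other):
        # d(u+v)/du = 1, d(u+v)/dv = 1
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        # d(uv)/du = v, d(uv)/dv = u
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def backward(self, seed=1.0):
        # reverse sweep of the chain rule (naive recursion; fine for small graphs)
        self.grad += seed
        for parent, local in self.parents:
            parent.backward(seed * local)

x, y = Var(3.0), Var(4.0)
f = x * x + x * y      # f = x^2 + x*y
f.backward()           # accumulates df/dx in x.grad and df/dy in y.grad
```

The practical payoff mirrors the paper's: the full gradient with respect to any number of inputs costs roughly a small constant multiple of one function evaluation, instead of one evaluation per design variable as in forward differencing.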
NASA Glenn Coefficients for Calculating Thermodynamic Properties of Individual Species
NASA Technical Reports Server (NTRS)
McBride, Bonnie J.; Zehe, Michael J.; Gordon, Sanford
2002-01-01
This report documents the library of thermodynamic data used with the NASA Glenn computer program CEA (Chemical Equilibrium with Applications). This library, containing data for over 2000 solid, liquid, and gaseous chemical species for temperatures ranging from 200 to 20,000 K, is available for use with other computer codes as well. The data are expressed as least-squares coefficients to a seven-term functional form for Cp°(T)/R with integration constants for H°(T)/RT and S°(T)/R. The NASA Glenn computer program PAC (Properties and Coefficients) was used to calculate thermodynamic functions and to generate the least-squares coefficients. PAC input was taken from a variety of sources. A complete listing of the database is given along with a summary of thermodynamic properties at 0 and 298.15 K.
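The seven-term functional form with its two integration constants can be evaluated as below. The polynomial layout follows the NASA Glenn/CEA convention as I understand it, and the coefficients used in the consistency check are invented for illustration, not taken from the database:

```python
import math

def cp_R(a, T):
    # Cp°/R = a1/T^2 + a2/T + a3 + a4*T + a5*T^2 + a6*T^3 + a7*T^4
    return (a[0] / T**2 + a[1] / T + a[2] + a[3] * T + a[4] * T**2
            + a[5] * T**3 + a[6] * T**4)

def h_RT(a, b1, T):
    # H°/RT, obtained by integrating Cp°/R; b1 is the enthalpy integration constant
    return (-a[0] / T**2 + a[1] * math.log(T) / T + a[2] + a[3] * T / 2
            + a[4] * T**2 / 3 + a[5] * T**3 / 4 + a[6] * T**4 / 5 + b1 / T)

def s_R(a, b2, T):
    # S°/R, obtained by integrating Cp°/(R*T); b2 is the entropy integration constant
    return (-a[0] / (2 * T**2) - a[1] / T + a[2] * math.log(T) + a[3] * T
            + a[4] * T**2 / 2 + a[5] * T**3 / 3 + a[6] * T**4 / 4 + b2)
```

A useful sanity check on any such fit is thermodynamic consistency: d(H°/R)/dT must equal Cp°/R, and d(S°/R)/dT must equal Cp°/(R·T), which finite differences confirm for the forms above.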
Networking Biology: The Origins of Sequence-Sharing Practices in Genomics.
Stevens, Hallam
2015-10-01
The wide sharing of biological data, especially nucleotide sequences, is now considered to be a key feature of genomics. Historians and sociologists have attempted to account for the rise of this sharing by pointing to precedents in model organism communities and in natural history. This article supplements these approaches by examining the role that electronic networking technologies played in generating the specific forms of sharing that emerged in genomics. The links between early computer users at the Stanford Artificial Intelligence Laboratory in the 1960s, biologists using local computer networks in the 1970s, and GenBank in the 1980s, show how networking technologies carried particular practices of communication, circulation, and data distribution from computing into biology. In particular, networking practices helped to transform sequences themselves into objects that had value as a community resource.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pike, Bill
Data—lots of data—generated in seconds and piling up on the internet, streaming and stored in countless databases. Big data is important for commerce, society and our nation’s security. Yet the volume, velocity, variety and veracity of data is simply too great for any single analyst to make sense of alone. It requires advanced, data-intensive computing. Simply put, data-intensive computing is the use of sophisticated computers to sort through mounds of information and present analysts with solutions in the form of graphics, scenarios, formulas, new hypotheses and more. This scientific capability is foundational to PNNL’s energy, environment and security missions. Senior Scientist and Division Director Bill Pike and his team are developing analytic tools that are used to solve important national challenges, including cyber systems defense, power grid control systems, intelligence analysis, climate change and scientific exploration.
The computational nature of memory modification.
Gershman, Samuel J; Monfils, Marie-H; Norman, Kenneth A; Niv, Yael
2017-03-15
Retrieving a memory can modify its influence on subsequent behavior. We develop a computational theory of memory modification, according to which modification of a memory trace occurs through classical associative learning, but which memory trace is eligible for modification depends on a structure learning mechanism that discovers the units of association by segmenting the stream of experience into statistically distinct clusters (latent causes). New memories are formed when the structure learning mechanism infers that a new latent cause underlies current sensory observations. By the same token, old memories are modified when old and new sensory observations are inferred to have been generated by the same latent cause. We derive this framework from probabilistic principles, and present a computational implementation. Simulations demonstrate that our model can reproduce the major experimental findings from studies of memory modification in the Pavlovian conditioning literature.
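The structure-learning step can be caricatured with a Chinese-restaurant-process prior over latent causes: observations similar to a familiar cause are assigned to it (and that memory is modified), while sufficiently novel observations make the new-cause term win (and a new memory is formed). This is a loose sketch of the framework's logic with made-up numbers, not the authors' implementation:

```python
def latent_cause_posterior(counts, similarity, alpha):
    """Posterior over which latent cause generated the current observations.
    counts: how often each known cause was inferred before (CRP prior weight);
    similarity: likelihood of the observations under each known cause;
    alpha: concentration parameter favoring a brand-new cause."""
    total = sum(counts) + alpha
    prior = [c / total for c in counts] + [alpha / total]
    like = list(similarity) + [1.0]     # new cause is uncommitted: vague likelihood
    post = [p * l for p, l in zip(prior, like)]
    z = sum(post)
    return [p / z for p in post]

def argmax(xs):
    return max(range(len(xs)), key=xs.__getitem__)
```

In the full model the winning old cause's associations are then updated by classical associative learning, whereas a winning new cause leaves old memories untouched, which is the paper's account of when retrieval does and does not modify a memory.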
Simulation of biochemical reactions with time-dependent rates by the rejection-based algorithm
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thanh, Vo Hong, E-mail: vo@cosbi.eu; Priami, Corrado, E-mail: priami@cosbi.eu; Department of Mathematics, University of Trento, Trento
We address the problem of simulating biochemical reaction networks with time-dependent rates and propose a new algorithm based on our rejection-based stochastic simulation algorithm (RSSA) [Thanh et al., J. Chem. Phys. 141(13), 134116 (2014)]. The selection of next reaction firings by our time-dependent RSSA (tRSSA) is computationally efficient. Furthermore, the generated trajectory is exact by exploiting the rejection-based mechanism. We benchmark tRSSA on different biological systems with varying forms of reaction rates to demonstrate its applicability and efficiency. We reveal that for nontrivial cases, the selection of reaction firings in existing algorithms introduces approximations because the integration of reaction rates is very computationally demanding and simplifying assumptions are introduced. The selection of the next reaction firing by our approach is easier while preserving the exactness.
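The rejection mechanism for a time-dependent rate is essentially thinning: propose firing times from a constant upper bound on the rate, then accept each proposal with probability rate(t)/rate_max, so no integration of the rate is needed. A minimal sketch of the principle (illustrative; the tRSSA paper's bookkeeping over full reaction networks is more involved):

```python
import math
import random

def next_firing_time(rate, rate_max, t, rng):
    """Sample the next event of an inhomogeneous process with intensity rate(t),
    given rate(t) <= rate_max, by proposing from the bound and thinning."""
    while True:
        t += rng.expovariate(rate_max)          # candidate from the bounding process
        if rng.random() * rate_max <= rate(t):  # accept with prob rate(t)/rate_max
            return t
```

Because accepted points are exactly distributed according to rate(t), the trajectory remains statistically exact, which is the property the abstract emphasizes over quadrature-based approximations.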
NASA Astrophysics Data System (ADS)
Wilson, Katherine E.; Henke, E.-F. Markus; Slipher, Geoffrey A.; Anderson, Iain A.
2017-04-01
Electromechanically coupled dielectric elastomer actuators (DEAs) and dielectric elastomer switches (DESs) may form digital logic circuitry made entirely of soft and flexible materials. The expansion in planar area of a DEA exerts force across a DES, which is a soft electrode with strain-dependent resistivity. When compressed, the DES drops steeply in resistance and changes state from non-conducting to conducting. Logic operators may be achieved with different arrangements of interacting DE actuators and switches. We demonstrate combinatorial logic elements, including the fundamental Boolean logic gates, as well as sequential logic elements, including latches and flip-flops. With both data storage and signal processing abilities, the necessary calculating components of a soft computer are available. A noteworthy advantage of a soft computer with mechanosensitive DESs is the potential for responding to environmental strains while locally processing information and generating a reaction, like a muscle reflex.
Learning a generative model of images by factoring appearance and shape.
Le Roux, Nicolas; Heess, Nicolas; Shotton, Jamie; Winn, John
2011-03-01
Computer vision has grown tremendously in the past two decades. Despite all efforts, existing attempts at matching parts of the human visual system's extraordinary ability to understand visual scenes lack either scope or power. By combining the advantages of general low-level generative models and powerful layer-based and hierarchical models, this work aims at being a first step toward richer, more flexible models of images. After comparing various types of restricted Boltzmann machines (RBMs) able to model continuous-valued data, we introduce our basic model, the masked RBM, which explicitly models occlusion boundaries in image patches by factoring the appearance of any patch region from its shape. We then propose a generative model of larger images using a field of such RBMs. Finally, we discuss how masked RBMs could be stacked to form a deep model able to generate more complicated structures and suitable for various tasks such as segmentation or object recognition.
NASA Technical Reports Server (NTRS)
King, J. C.
1976-01-01
The generation of satellite coverage patterns is facilitated by three basic strategies: use of a simplified physical model, permitting rapid closed-form calculation; separation of earth rotation and nodal precession from initial geometric analyses; and use of symmetries to construct traces of indefinite length by repetitive transposition of basic one-quadrant elements. The complete coverage patterns generated consist of a basic nadir trace plus a number of associated off-nadir traces, one for each sensor swath edge to be delineated. Each trace is generated by transposing one or two of the basic quadrant elements into a circle on a nonrotating earth model sphere, after which the circle is expanded into the actual 'helical' pattern by adding rotational displacements to the longitude coordinates. The procedure adapts to the important periodic coverage cases by direct insertion of the characteristic integers N and R (days and orbital revolutions, respectively, per coverage period).
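The "circle plus rotational displacement" construction can be sketched directly: compute the nadir trace on a nonrotating sphere from the inclination and argument of latitude, then add the Earth-rotation displacement to the longitudes to obtain the actual "helical" pattern. This is a simplified circular-orbit model for illustration, not the paper's quadrant-element procedure:

```python
import math

def ground_track(incl_deg, period_min, n_steps, dt_s):
    """Nadir trace (lat, lon in degrees) of a circular orbit over a rotating Earth."""
    i = math.radians(incl_deg)
    w_orb = 2 * math.pi / (period_min * 60.0)   # orbital angular rate
    w_earth = 2 * math.pi / 86164.0             # Earth sidereal rotation rate
    track = []
    for k in range(n_steps):
        t = k * dt_s
        u = w_orb * t                           # argument of latitude
        lat = math.asin(math.sin(i) * math.sin(u))
        # longitude on the nonrotating sphere, then subtract Earth rotation
        lon = math.atan2(math.cos(i) * math.sin(u), math.cos(u)) - w_earth * t
        lon = (lon + math.pi) % (2 * math.pi) - math.pi   # wrap to [-180, 180)
        track.append((math.degrees(lat), math.degrees(lon)))
    return track
```

Periodic coverage corresponds to the track repeating after R orbital revolutions in N days, the characteristic integers the paper inserts directly into its construction.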
DOE Office of Scientific and Technical Information (OSTI.GOV)
Savic, Vesna; Hector, Louis G.; Ezzat, Hesham
This paper presents an overview of a four-year project focused on development of an integrated computational materials engineering (ICME) toolset for third generation advanced high-strength steels (3GAHSS). Following a brief look at ICME as an emerging discipline within the Materials Genome Initiative, technical tasks in the ICME project will be discussed. Specific aims of the individual tasks are multi-scale, microstructure-based material model development using state-of-the-art computational and experimental techniques, forming, toolset assembly, design optimization, integration and technical cost modeling. The integrated approach is initially illustrated using a 980 grade transformation induced plasticity (TRIP) steel, subject to a two-step quenching and partitioning (Q&P) heat treatment, as an example.
Implementation of the Sun Position Calculation in the PDC-1 Control Microprocessor
NASA Technical Reports Server (NTRS)
Stallkamp, J. A.
1984-01-01
The several computational approaches to providing the local azimuth and elevation angles of the Sun as a function of local time, and the utilization of the most appropriate method in the PDC-1 microprocessor, are presented. The full algorithm, in FORTRAN form, is felt to be very useful in any kind or size of computer. It was used in the PDC-1 unit to generate efficient code for the microprocessor with its floating point arithmetic chip. The balance of the presentation consists of a brief discussion of the tracking requirements for PDC-1, the planetary motion equations from the first to the final version, and the local azimuth-elevation geometry.
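The core azimuth/elevation computation can be sketched from the standard spherical-astronomy formulas, given latitude, solar declination, and local hour angle. This is a generic illustration, not the PDC-1 algorithm itself, and the azimuth convention (measured from north, positive toward east) is an assumption:

```python
import math

def sun_az_el(lat_deg, decl_deg, hour_angle_deg):
    """Solar elevation and azimuth (degrees) from latitude, declination, hour angle."""
    lat, dec, ha = (math.radians(v) for v in (lat_deg, decl_deg, hour_angle_deg))
    # elevation from the spherical law of cosines
    sin_el = (math.sin(lat) * math.sin(dec)
              + math.cos(lat) * math.cos(dec) * math.cos(ha))
    el = math.asin(sin_el)
    # azimuth from north, positive eastward
    az = math.atan2(-math.cos(dec) * math.sin(ha),
                    math.cos(lat) * math.sin(dec)
                    - math.sin(lat) * math.cos(dec) * math.cos(ha))
    return math.degrees(el), math.degrees(az)
```

A tracker microprocessor would evaluate exactly this kind of closed-form geometry each control cycle, with the declination and hour angle supplied by the planetary motion equations the report discusses.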
NASA Technical Reports Server (NTRS)
1982-01-01
A FORTRAN coded computer program and method to predict the reaction control fuel consumption statistics for a three-axis stabilized rocket vehicle upper stage is described. A Monte Carlo approach is used, made more efficient by using closed-form estimates of impulses. The effects of rocket motor thrust misalignment, static unbalance, aerodynamic disturbances, and deviations in trajectory, mass properties, and control system characteristics are included. This routine can be applied to many types of on-off reaction controlled vehicles. The pseudorandom number generation and statistical analysis subroutines, including the output histograms, can be used for other Monte Carlo analysis problems.
Finite element concepts in computational aerodynamics
NASA Technical Reports Server (NTRS)
Baker, A. J.
1978-01-01
Finite element theory was employed to establish an implicit numerical solution algorithm for the time averaged unsteady Navier-Stokes equations. Both the multidimensional and a time-split form of the algorithm were considered, the latter of particular interest for problem specification on a regular mesh. A Newton matrix iteration procedure is outlined for solving the resultant nonlinear algebraic equation systems. Multidimensional discretization procedures are discussed with emphasis on automated generation of specific nonuniform solution grids and accounting of curved surfaces. The time-split algorithm was evaluated with regards to accuracy and convergence properties for hyperbolic equations on rectangular coordinates. An overall assessment of the viability of the finite element concept for computational aerodynamics is made.
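The Newton matrix iteration mentioned for the resultant nonlinear algebraic systems can be illustrated on a 2x2 toy system (a generic sketch of Newton's method, not the paper's finite element solver): repeatedly solve the linearized system J(x) Δx = F(x) and update x ← x − Δx.

```python
def newton2(F, J, x, tol=1e-12, max_iter=50):
    """Newton iteration for a 2x2 nonlinear system F(x) = 0 with Jacobian J(x)."""
    for _ in range(max_iter):
        f1, f2 = F(x)
        if abs(f1) + abs(f2) < tol:
            break
        (a, b), (c, d) = J(x)
        det = a * d - b * c                     # closed-form 2x2 solve of J dx = F
        x = (x[0] - (d * f1 - b * f2) / det,
             x[1] - (a * f2 - c * f1) / det)
    return x
```

In a finite element flow code the same structure appears at vastly larger scale: F collects the discretized residuals, J their linearization, and each Newton step requires a large sparse linear solve.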
NASA Astrophysics Data System (ADS)
Radziszewski, Kacper
2017-10-01
This paper presents the results of research in machine learning, investigating the scope of application of artificial neural network algorithms as a tool in architectural design. The computational experiment trained an artificial neural network by backward propagation of errors on the geometry of the details of a Roman Corinthian order capital. As the input training set, a combination of five local geometry parameters gave the best results: theta, phi, and rho in a spherical coordinate system based on the capital volume centroid, followed by the Z value of the Cartesian coordinate system and the distance from vertical planes derived from the capital's symmetry. The experiment also identified an optimal count and structure of hidden layers, yielding errors below 0.2% for these input parameters. Once successfully trained, the artificial network was able to mimic the detail composition on any other geometry type given. Despite calculating the transformed geometry locally and separately for each of thousands of surface points, the system could create visually attractive, diverse, complex patterns. The designed tool, based on the supervised learning branch of machine learning, makes it possible to generate new architectural forms free of the bounds of the designer's imagination. Implementing the broad computational methods of machine learning, or artificial intelligence in general, could not only accelerate and simplify the design process, but also offer an opportunity to explore never-before-seen, unpredictable forms for everyday architectural practice.
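A minimal sketch of training by backward propagation of errors on five input features; the data set here is synthetic, standing in for the capital-geometry features described above:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the capital-geometry data: five local
# features (theta, phi, rho, z, plane distance) mapped to a surface offset.
X = rng.uniform(-1, 1, size=(200, 5))
y = (0.5 * np.sin(3 * X[:, 0]) + 0.3 * X[:, 3] ** 2)[:, None]

# One hidden layer, trained by backpropagation (plain gradient descent).
W1 = rng.normal(0, 0.5, (5, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(2000):
    h = np.tanh(X @ W1 + b1)          # forward pass
    pred = h @ W2 + b2
    err = pred - y
    # Backward pass: chain rule through the tanh hidden layer
    gW2 = h.T @ err / len(X); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
print(f"training MSE: {mse:.4f}")
```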
Peng, Hanchuan; Tang, Jianyong; Xiao, Hang; Bria, Alessandro; Zhou, Jianlong; Butler, Victoria; Zhou, Zhi; Gonzalez-Bellido, Paloma T; Oh, Seung W; Chen, Jichao; Mitra, Ananya; Tsien, Richard W; Zeng, Hongkui; Ascoli, Giorgio A; Iannello, Giulio; Hawrylycz, Michael; Myers, Eugene; Long, Fuhui
2014-07-11
Three-dimensional (3D) bioimaging, visualization and data analysis are in strong need of powerful 3D exploration techniques. We develop virtual finger (VF) to generate 3D curves, points and regions-of-interest in the 3D space of a volumetric image with a single finger operation, such as a computer mouse stroke, or click or zoom from the 2D-projection plane of an image as visualized with a computer. VF provides efficient methods for acquisition, visualization and analysis of 3D images for roundworm, fruitfly, dragonfly, mouse, rat and human. Specifically, VF enables instant 3D optical zoom-in imaging, 3D free-form optical microsurgery, and 3D visualization and annotation of terabytes of whole-brain image volumes. VF also leads to orders of magnitude better efficiency of automated 3D reconstruction of neurons and similar biostructures over our previous systems. We use VF to generate from images of 1,107 Drosophila GAL4 lines a projectome of a Drosophila brain.
Design, implementation and flight testing of PIF autopilots for general aviation aircraft
NASA Technical Reports Server (NTRS)
Broussard, J. R.
1983-01-01
The designs of Proportional-Integral-Filter (PIF) autopilots for a General Aviation (NAVION) aircraft are presented. The PIF autopilot uses the sampled-data regulator and command generator tracking to implement roll select, pitch select, heading select, altitude select, and localizer/glideslope capture and hold autopilot modes. The PIF control law uses typical General Aviation sensors for state feedback, command error integration for command tracking, digital complementary filtering and analog prefiltering for sensor noise suppression, a control filter to accommodate computation delay, and the incremental form to eliminate trim values in implementation. Theoretical developments, described in detail, were needed to combine the sampled-data regulator with command generator tracking for use as a digital flight control system. The digital PIF autopilots are evaluated using closed-loop eigenvalues and linear simulations. The implementation of the PIF autopilots in a digital flight computer using a high-order language (FORTRAN) is briefly described. The successful flight test results for each PIF autopilot mode are presented.
NASA Technical Reports Server (NTRS)
Zehe, Michael J.; Gordon, Sanford; McBride, Bonnie J.
2002-01-01
For several decades the NASA Glenn Research Center has been providing a file of thermodynamic data for use in several computer programs. These data are in the form of least-squares coefficients that have been calculated from tabular thermodynamic data by means of the NASA Properties and Coefficients (PAC) program. The source thermodynamic data are obtained from the literature or from standard compilations. Most gas-phase thermodynamic functions are calculated by the authors from molecular constant data using ideal gas partition functions. The Coefficients and Properties (CAP) program described in this report permits the generation of tabulated thermodynamic functions from the NASA least-squares coefficients. CAP provides considerable flexibility in the output format, the number of temperatures to be tabulated, and the energy units of the calculated properties. This report provides a detailed description of input preparation, examples of input and output for several species, and a listing of all species in the current NASA Glenn thermodynamic data file.
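The least-squares coefficient form that CAP tabulates can be illustrated with the classic NASA 7-term polynomials for Cp/R, H/(RT), and S/R; the coefficients below are illustrative placeholders, not values from the NASA Glenn data file:

```python
import math

def thermo_from_coeffs(a, T):
    """Evaluate Cp/R, H/(RT), S/R from NASA 7-term least-squares
    coefficients (the classic Lewis/Glenn polynomial form)."""
    a1, a2, a3, a4, a5, a6, a7 = a
    cp_R = a1 + a2*T + a3*T**2 + a4*T**3 + a5*T**4
    h_RT = a1 + a2*T/2 + a3*T**2/3 + a4*T**3/4 + a5*T**4/5 + a6/T
    s_R = a1*math.log(T) + a2*T + a3*T**2/2 + a4*T**3/3 + a5*T**4/4 + a7
    return cp_R, h_RT, s_R

# Illustrative coefficients for a diatomic-like species (not real data)
coeffs = (3.5, 1.0e-4, 0.0, 0.0, 0.0, -1000.0, 4.0)
for T in (300.0, 1000.0, 2000.0):
    cp, h, s = thermo_from_coeffs(coeffs, T)
    print(f"T={T:6.0f} K  Cp/R={cp:.3f}  H/RT={h:+.3f}  S/R={s:.3f}")
```

Tabulating these functions over a user-chosen temperature list is exactly the inverse of what PAC does when it fits coefficients to tables.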
CAP: A Computer Code for Generating Tabular Thermodynamic Functions from NASA Lewis Coefficients
NASA Technical Reports Server (NTRS)
Zehe, Michael J.; Gordon, Sanford; McBride, Bonnie J.
2001-01-01
For several decades the NASA Glenn Research Center has been providing a file of thermodynamic data for use in several computer programs. These data are in the form of least-squares coefficients that have been calculated from tabular thermodynamic data by means of the NASA Properties and Coefficients (PAC) program. The source thermodynamic data are obtained from the literature or from standard compilations. Most gas-phase thermodynamic functions are calculated by the authors from molecular constant data using ideal gas partition functions. The Coefficients and Properties (CAP) program described in this report permits the generation of tabulated thermodynamic functions from the NASA least-squares coefficients. CAP provides considerable flexibility in the output format, the number of temperatures to be tabulated, and the energy units of the calculated properties. This report provides a detailed description of input preparation, examples of input and output for several species, and a listing of all species in the current NASA Glenn thermodynamic data file.
Methods in Symbolic Computation and p-Adic Valuations of Polynomials
NASA Astrophysics Data System (ADS)
Guan, Xiao
Symbolic computation has appeared widely in many mathematical fields such as combinatorics, number theory and stochastic processes. The techniques created in the area of experimental mathematics provide efficient ways of computing symbolically and of verifying complicated relations. Part I consists of three problems. The first focuses on a unimodal sequence derived from a quartic integral; many of its properties are explored with the help of hypergeometric representations and automatic proofs. The second problem tackles the generating function of the reciprocals of the Catalan numbers, springing from the closed form given by Mathematica; three methods from the theory of special functions are used to justify this result. The third addresses closed-form solutions for the moments of products of generalized elliptic integrals, combining experimental mathematics and classical analysis. Part II concentrates on the p-adic valuations of polynomials from the perspective of trees. For a given polynomial f(n) indexed by the positive integers, the package developed in Mathematica creates a tree structure following a couple of rules. The evolution of such trees is studied both rigorously and experimentally from the viewpoints of field extensions, nonparametric statistics and random matrices.
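The basic object of Part II, the p-adic valuation v_p(f(n)), is easy to compute directly. This sketch (in Python rather than the Mathematica package described) produces the flat valuation sequence from which the tree structure is built:

```python
def vp(n, p):
    """p-adic valuation of a nonzero integer n."""
    v = 0
    while n % p == 0:
        n //= p
        v += 1
    return v

def valuation_sequence(f, p, N):
    """v_p(f(n)) for n = 1..N; the trees in the text group n by residue
    classes mod powers of p, which this flat sequence only hints at."""
    return [vp(f(n), p) for n in range(1, N + 1)]

# Example: f(n) = n^2 + 1 and p = 5; v_5 is positive exactly when
# n = 2 or 3 (mod 5), since -1 is a quadratic residue mod 5.
seq = valuation_sequence(lambda n: n * n + 1, 5, 10)
print(seq)
```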
Universal Linear Motor Driven Leg Press Dynamometer and Concept of Serial Stretch Loading.
Hamar, Dušan
2015-08-24
This paper deals with the background and principles of a universal linear-motor-driven leg press dynamometer and the concept of serial stretch loading. The device is based on two computer-controlled linear motors mounted on horizontal rails. As the motors can maintain either a constant resistance force in a selected position or a constant velocity in both directions, the system allows simulation of any mode of muscle contraction. In addition, it can generate defined serial stretch stimuli in the form of repeated force peaks, achieved by short segments of reversed velocity (in the concentric phase) or acceleration (in the eccentric phase). Such stimuli, generated at a rate of 10 Hz, have proven a more efficient means of improving the rate of force development. This capability not only affects performance in many sports, but also plays a substantial role in preventing falls and their consequences. A universal linear-motor-driven, computer-controlled dynamometer with the unique capability of generating serial stretch stimuli appears to be an efficient and useful tool for enhancing the effects of strength training on neuromuscular function, not only in athletes but also in the senior population and in rehabilitation patients.
Capturing remote mixing due to internal tides using multi-scale modeling tool: SOMAR-LES
NASA Astrophysics Data System (ADS)
Santilli, Edward; Chalamalla, Vamsi; Scotti, Alberto; Sarkar, Sutanu
2016-11-01
Internal tides that are generated during the interaction of an oscillating barotropic tide with the bottom bathymetry dissipate only a fraction of their energy near the generation region. The rest is radiated away in the form of low- and high-mode internal tides. These internal tides dissipate energy at remote locations when they interact with the upper-ocean pycnocline, the continental slope, and large-scale eddies. Capturing the wide range of length and time scales involved during the life cycle of internal tides is computationally very expensive. A recently developed multi-scale modeling tool called SOMAR-LES combines the adaptive grid refinement features of SOMAR with the turbulence modeling features of a Large Eddy Simulation (LES) to capture multi-scale processes at a reduced computational cost. Numerical simulations of internal tide generation over idealized bottom bathymetries are performed to demonstrate this multi-scale modeling technique. Although each of the remote mixing phenomena has been considered independently in previous studies, this work aims to capture remote mixing processes during the life cycle of an internal tide in more realistic settings, by allowing multi-level (coarse and fine) grids to co-exist and exchange information during the time-stepping process.
NASA Astrophysics Data System (ADS)
Jung, Jin Woo; Lee, Jung-Seob; Cho, Dong-Woo
2016-02-01
Recently, much attention has focused on replacement and/or enhancement of biological tissues via the use of cell-laden hydrogel scaffolds with an architecture that mimics the tissue matrix, and with the desired three-dimensional (3D) external geometry. However, mimicking the heterogeneous tissues of which most organs are formed is challenging. Although multiple-head 3D printing systems have been proposed for fabricating heterogeneous cell-laden hydrogel scaffolds, to date only simple exterior forms have been realized. Here we describe a computer-aided design and manufacturing (CAD/CAM) system for this application. We aim to develop an algorithm to enable easy, intuitive design and fabrication of heterogeneous cell-laden hydrogel scaffolds with a free-form 3D geometry. The printing paths of the scaffold are automatically generated from the 3D CAD model, and the scaffold is then printed by dispensing four materials: a frame, two kinds of cell-laden hydrogel, and a support. We demonstrated printing of heterogeneous tissue models formed of hydrogel scaffolds using this approach, including outer ear, kidney and tooth tissue. These results indicate that this approach is particularly promising for tissue engineering and 3D printing applications to regenerate heterogeneous organs and tissues with tailored geometries to treat specific defects or injuries.
Jung, Jin Woo; Lee, Jung-Seob; Cho, Dong-Woo
2016-02-22
Recently, much attention has focused on replacement and/or enhancement of biological tissues via the use of cell-laden hydrogel scaffolds with an architecture that mimics the tissue matrix, and with the desired three-dimensional (3D) external geometry. However, mimicking the heterogeneous tissues of which most organs are formed is challenging. Although multiple-head 3D printing systems have been proposed for fabricating heterogeneous cell-laden hydrogel scaffolds, to date only simple exterior forms have been realized. Here we describe a computer-aided design and manufacturing (CAD/CAM) system for this application. We aim to develop an algorithm to enable easy, intuitive design and fabrication of heterogeneous cell-laden hydrogel scaffolds with a free-form 3D geometry. The printing paths of the scaffold are automatically generated from the 3D CAD model, and the scaffold is then printed by dispensing four materials: a frame, two kinds of cell-laden hydrogel, and a support. We demonstrated printing of heterogeneous tissue models formed of hydrogel scaffolds using this approach, including outer ear, kidney and tooth tissue. These results indicate that this approach is particularly promising for tissue engineering and 3D printing applications to regenerate heterogeneous organs and tissues with tailored geometries to treat specific defects or injuries.
Lu, Chunliang; Su, Xiaoge; Floreancig, Paul E.
2013-01-01
Vinyl ethers can be protonated to generate oxocarbenium ions that react with Me3SiCN to form cyanohydrin alkyl ethers. Reactions that form racemic products proceed efficiently upon converting the vinyl ether to an α-chloro ether prior to cyanide addition in a pathway that proceeds through Brønsted acid-mediated chloride ionization. Enantiomerically enriched products can be accessed by directly protonating the vinyl ether with a chiral Brønsted acid to form a chiral ion pair. Me3SiCN acts as the nucleophile and PhOH serves as a stoichiometric proton source in a rare example of an asymmetric bimolecular nucleophilic addition reaction into an oxocarbenium ion. Computational studies provide a model for the interaction between the catalyst and the oxocarbenium ion. PMID:23968162
Dynamic Modelling Of A SCARA Robot
NASA Astrophysics Data System (ADS)
Turiel, J. Perez; Calleja, R. Grossi; Diez, V. Gutierrez
1987-10-01
This paper describes a method for modelling industrial robots that takes a dynamic approach to manipulation-system motion generation, obtaining the complete dynamic model for the mechanical part of the robot and taking into account the dynamic effects of the actuators acting at the joints. For a four-degree-of-freedom SCARA robot we obtain the dynamic model for the basic (minimal) configuration, that is, the three degrees of freedom that place the robot end effector at a desired point, using the Lagrange method to obtain the dynamic equations in matrix form. The manipulator is considered to be a set of rigid bodies interconnected by joints in the form of simple kinematic pairs. The state-space model is then obtained for the actuators that move the robot joints, combining the models of the individual actuators: two DC permanent-magnet servomotors and an electrohydraulic actuator. Finally, using a computer simulation program written in FORTRAN, we can compute the matrices of the complete model.
A mass weighted chemical elastic network model elucidates closed form domain motions in proteins
Kim, Min Hyeok; Seo, Sangjae; Jeong, Jay Il; Kim, Bum Joon; Liu, Wing Kam; Lim, Byeong Soo; Choi, Jae Boong; Kim, Moon Ki
2013-01-01
An elastic network model (ENM), usually a Cα coarse-grained one, has been widely used to study protein dynamics as an alternative to classical molecular dynamics simulation. This simple approach dramatically reduces the computational cost, but sometimes fails to describe a feasible conformational change due to unrealistically excessive spring connections. To overcome this limitation, we propose a mass-weighted chemical elastic network model (MWCENM) in which the total mass of each residue is assumed to be concentrated on the representative alpha-carbon atom and stiffness values are precisely assigned according to the types of chemical interactions. We test MWCENM on several well-known proteins for which both closed and open conformations are available, as well as on three α-helix-rich proteins. Normal mode analysis reveals that MWCENM not only generates more plausible conformational changes, especially for closed forms of proteins, but also preserves protein secondary structures, distinguishing MWCENM from traditional ENMs. In addition, MWCENM reduces the computational burden by using a sparser stiffness matrix. PMID:23456820
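An anisotropic-network-style Hessian with per-contact stiffness can serve as a simplified stand-in for MWCENM's chemically assigned stiffness values (the mass weighting and chemical rules are not reproduced; `k_of_pair` is an arbitrary user rule in this sketch):

```python
import numpy as np

def anm_hessian(coords, k_of_pair, cutoff=8.0):
    """Anisotropic-network-model Hessian with per-contact stiffness.
    Each contact within the cutoff contributes a rank-1 block along
    the inter-residue direction."""
    n = len(coords)
    H = np.zeros((3 * n, 3 * n))
    for i in range(n):
        for j in range(i + 1, n):
            d = coords[j] - coords[i]
            r = np.linalg.norm(d)
            if r > cutoff:
                continue
            k = k_of_pair(i, j)
            block = -k * np.outer(d, d) / r**2
            H[3*i:3*i+3, 3*j:3*j+3] = block
            H[3*j:3*j+3, 3*i:3*i+3] = block
            H[3*i:3*i+3, 3*i:3*i+3] -= block
            H[3*j:3*j+3, 3*j:3*j+3] -= block
    return H

# Toy non-collinear 4-residue chain with uniform stiffness; a real
# MWCENM would assign k by chemical interaction type instead.
coords = np.array([[0.0, 0, 0], [3.8, 0, 0], [5.5, 3.0, 0], [5.5, 3.0, 3.8]])
H = anm_hessian(coords, lambda i, j: 1.0)
evals = np.sort(np.linalg.eigvalsh(H))
print("smallest eigenvalues:", np.round(evals[:7], 3))
```

Normal mode analysis then reads off the six zero modes (rigid translations and rotations) and the nonzero internal modes.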
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-13
Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational, and Systems Biology. A draft document on how molecular, computational, and systems biology data can better inform risk assessment.
Computational Motion Phantoms and Statistical Models of Respiratory Motion
NASA Astrophysics Data System (ADS)
Ehrhardt, Jan; Klinder, Tobias; Lorenz, Cristian
Breathing motion is not a robust and 100 % reproducible process, and inter- and intra-fractional motion variations form an important problem in radiotherapy of the thorax and upper abdomen. A widespread consensus nowadays exists that it would be useful to use prior knowledge about respiratory organ motion and its variability to improve radiotherapy planning and treatment delivery. This chapter discusses two different approaches to model the variability of respiratory motion. In the first part, we review computational motion phantoms, i.e. computerized anatomical and physiological models. Computational phantoms are excellent tools to simulate and investigate the effects of organ motion in radiation therapy and to gain insight into methods for motion management. The second part of this chapter discusses statistical modeling techniques to describe the breathing motion and its variability in a population of 4D images. Population-based models can be generated from repeatedly acquired 4D images of the same patient (intra-patient models) and from 4D images of different patients (inter-patient models). The generation of those models is explained and possible applications of those models for motion prediction in radiotherapy are exemplified. Computational models of respiratory motion and motion variability have numerous applications in radiation therapy, e.g. to understand motion effects in simulation studies, to develop and evaluate treatment strategies or to introduce prior knowledge into the patient-specific treatment planning.
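A statistical motion model in the chapter's second sense can be sketched as PCA over repeated motion samples; the "breathing" displacement data below is synthetic, not from 4D images:

```python
import numpy as np

# Each row is one breathing cycle's displacement trace, flattened; PCA
# (via SVD) yields a mean motion plus principal modes of variation.
rng = np.random.default_rng(3)
n_cycles, n_points = 40, 60
t = np.linspace(0, 2 * np.pi, n_points)
amp = 1.0 + 0.2 * rng.standard_normal(n_cycles)   # amplitude variability
phase = 0.1 * rng.standard_normal(n_cycles)       # phase variability
fields = amp[:, None] * np.sin(t[None, :] + phase[:, None])

mean = fields.mean(axis=0)
U, S, Vt = np.linalg.svd(fields - mean, full_matrices=False)
explained = (S ** 2) / (S ** 2).sum()
print("variance explained by first two modes:", round(float(explained[:2].sum()), 3))
```

With amplitude and phase as the only sources of variability, the centered data lies in the span of sin(t) and cos(t), so two modes capture essentially all variance; real intra- or inter-patient models keep however many modes the data warrants.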
Design and Implementation of a Cloud Computing Adoption Decision Tool: Generating a Cloud Road.
Bildosola, Iñaki; Río-Belver, Rosa; Cilleruelo, Ernesto; Garechana, Gaizka
2015-01-01
Migrating to cloud computing is one of the current enterprise challenges. This technology provides a new paradigm based on "on-demand payment" for information and communication technologies. In this sense, the small and medium enterprise is supposed to be the most interested, since initial investments are avoided and the technology allows gradual implementation. However, even if the characteristics and capacities have been widely discussed, entry into the cloud is still lacking in terms of practical, real frameworks. This paper aims at filling this gap, presenting a real tool already implemented and tested, which can be used as a cloud computing adoption decision tool. This tool uses diagnosis based on specific questions to gather the required information and subsequently provide the user with valuable information to deploy the business within the cloud, specifically in the form of Software as a Service (SaaS) solutions. This information allows the decision makers to generate their particular Cloud Road. A pilot study has been carried out with enterprises at a local level with a two-fold objective: to ascertain the degree of knowledge on cloud computing and to identify the most interesting business areas and their related tools for this technology. As expected, the results show high interest and low knowledge on this subject and the tool presented aims to readdress this mismatch, insofar as possible.
Design and Implementation of a Cloud Computing Adoption Decision Tool: Generating a Cloud Road
Bildosola, Iñaki; Río-Belver, Rosa; Cilleruelo, Ernesto; Garechana, Gaizka
2015-01-01
Migrating to cloud computing is one of the current enterprise challenges. This technology provides a new paradigm based on “on-demand payment” for information and communication technologies. In this sense, the small and medium enterprise is supposed to be the most interested, since initial investments are avoided and the technology allows gradual implementation. However, even if the characteristics and capacities have been widely discussed, entry into the cloud is still lacking in terms of practical, real frameworks. This paper aims at filling this gap, presenting a real tool already implemented and tested, which can be used as a cloud computing adoption decision tool. This tool uses diagnosis based on specific questions to gather the required information and subsequently provide the user with valuable information to deploy the business within the cloud, specifically in the form of Software as a Service (SaaS) solutions. This information allows the decision makers to generate their particular Cloud Road. A pilot study has been carried out with enterprises at a local level with a two-fold objective: to ascertain the degree of knowledge on cloud computing and to identify the most interesting business areas and their related tools for this technology. As expected, the results show high interest and low knowledge on this subject and the tool presented aims to readdress this mismatch, insofar as possible. PMID:26230400
On the use of Schwarz-Christoffel conformal mappings to the grid generation for global ocean models
NASA Astrophysics Data System (ADS)
Xu, S.; Wang, B.; Liu, J.
2015-02-01
In this article we propose two conformal-mapping-based grid generation algorithms for global ocean general circulation models (OGCMs). In contrast to conventional dipolar or tripolar grids based on analytical forms, the new algorithms are based on Schwarz-Christoffel (SC) conformal mapping with prescribed boundary information. While dealing with the basic grid-design problem of pole relocation, the new algorithms also address more advanced issues such as smoothed scaling factors and the new requirements on OGCM grids arising from the recent trend toward high-resolution and multi-scale modeling. The proposed grid generation algorithms could potentially achieve alignment of grid lines with coastlines, enhanced spatial resolution in coastal regions, and easier computational load balancing. Since the generated grids are still orthogonal curvilinear, they can be readily used in existing Bryan-Cox-Semtner-type ocean models. The methodology can also be applied to grid generation for regional ocean modeling where a complex land-ocean distribution is present.
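The property that makes conformal grids usable in Bryan-Cox-Semtner-type models is orthogonality of the mapped grid lines. This sketch checks that property numerically for a simple conformal map (not the Schwarz-Christoffel map with prescribed boundary information that the paper constructs):

```python
import numpy as np

# A conformal map sends an orthogonal computational grid to an orthogonal
# physical grid. We use w = exp(z) purely to verify the orthogonality
# property; SC mapping would be needed for polygonal coastline boundaries.
xi = np.linspace(0.0, 1.0, 21)
eta = np.linspace(0.0, np.pi / 2, 21)
Z = xi[None, :] + 1j * eta[:, None]
W = np.exp(Z)

# Finite-difference tangent vectors along the two grid directions
t_xi = np.diff(W, axis=1)[:-1, :]
t_eta = np.diff(W, axis=0)[:, :-1]
cosang = np.real(t_xi * np.conj(t_eta)) / (np.abs(t_xi) * np.abs(t_eta))
print("max |cos angle| between grid lines:", float(np.abs(cosang).max()))
```

The cosine of the angle between the two families of grid lines stays near zero everywhere (up to finite-difference error), which is exactly the orthogonal-curvilinear property the models require.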
Su, Chuan-Jun; Chiang, Chang-Yu; Chih, Meng-Chun
2014-03-07
Good physical fitness generally makes the body less prone to common diseases. A personalized exercise plan that promotes a balanced approach helps improve fitness, while inappropriate forms of exercise can have adverse consequences for health. This paper aims to develop an ontology-driven knowledge-based system for generating custom-designed exercise plans based on a user's profile and health status, incorporating Health Level Seven International (HL7) standard data on physical fitness and health screening. The generated plan is exposed through Representational State Transfer (REST) web services, which can be accessed from any Internet-enabled device and deployed in cloud computing environments. To ensure the practicality of the generated exercise plans, the knowledge used as a basis for inference in the system is acquired from domain experts. The proposed Ubiquitous Exercise Plan Generation for Personalized Physical Fitness (UFIT) system will not only improve health-related fitness through personalized exercise plans, but also help users avoid inappropriate workouts.
Su, Chuan-Jun; Chiang, Chang-Yu; Chih, Meng-Chun
2014-01-01
Good physical fitness generally makes the body less prone to common diseases. A personalized exercise plan that promotes a balanced approach helps improve fitness, while inappropriate forms of exercise can have adverse consequences for health. This paper aims to develop an ontology-driven knowledge-based system for generating custom-designed exercise plans based on a user's profile and health status, incorporating Health Level Seven International (HL7) standard data on physical fitness and health screening. The generated plan is exposed through Representational State Transfer (REST) web services, which can be accessed from any Internet-enabled device and deployed in cloud computing environments. To ensure the practicality of the generated exercise plans, the knowledge used as a basis for inference in the system is acquired from domain experts. The proposed Ubiquitous Exercise Plan Generation for Personalized Physical Fitness (UFIT) system will not only improve health-related fitness through personalized exercise plans, but also help users avoid inappropriate workouts. PMID:24608002
Diffeomorphometry and geodesic positioning systems for human anatomy.
Miller, Michael I; Younes, Laurent; Trouvé, Alain
2014-03-01
The Computational Anatomy project has largely been a study of large deformations within a Riemannian framework as an efficient point of view for generating metrics between anatomical configurations. This approach turns D'Arcy Thompson's comparative morphology of human biological shape and form into a metrizable space. Since the metric is constructed from the geodesic length of the flows of diffeomorphisms connecting the forms, we call it diffeomorphometry. Just as importantly, since the flows describe algebraic group action on anatomical submanifolds and associated functional measurements, they become the basis for positioning information, which we term geodesic positioning. As well, the geodesic connections provide Riemannian coordinates for locating forms in the anatomical orbit, which we call geodesic coordinates. These three components taken together, the metric, geodesic positioning of information, and geodesic coordinates, we term the geodesic positioning system. We illustrate via several examples in human and biological coordinate systems and machine learning of the statistical representation of shape and form.
Discriminating Among Probability Weighting Functions Using Adaptive Design Optimization
Cavagnaro, Daniel R.; Pitt, Mark A.; Gonzalez, Richard; Myung, Jay I.
2014-01-01
Probability weighting functions relate objective probabilities and their subjective weights, and play a central role in modeling choices under risk within cumulative prospect theory. While several different parametric forms have been proposed, their qualitative similarities make it challenging to discriminate among them empirically. In this paper, we use both simulation and choice experiments to investigate the extent to which different parametric forms of the probability weighting function can be discriminated using adaptive design optimization, a computer-based methodology that identifies and exploits model differences for the purpose of model discrimination. The simulation experiments show that the correct (data-generating) form can be conclusively discriminated from its competitors. The results of an empirical experiment reveal heterogeneity between participants in terms of the functional form, with two models (Prelec-2, Linear in Log Odds) emerging as the most common best-fitting models. The findings shed light on assumptions underlying these models. PMID:24453406
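The two best-fitting forms named above have simple closed forms; a sketch with arbitrary illustrative parameter values (not fitted to the paper's data):

```python
import math

def prelec2(p, a, b):
    """Two-parameter Prelec weighting function: w(p) = exp(-b(-ln p)^a)."""
    return math.exp(-b * (-math.log(p)) ** a)

def lin_log_odds(p, g, d):
    """Linear-in-log-odds form: w(p) = d p^g / (d p^g + (1-p)^g)."""
    return d * p ** g / (d * p ** g + (1 - p) ** g)

# Both families are typically inverse-S-shaped: small probabilities are
# overweighted, large probabilities underweighted.
for p in (0.01, 0.5, 0.99):
    print(f"p={p:4.2f}  Prelec-2={prelec2(p, 0.6, 1.0):.3f}  "
          f"LLO={lin_log_odds(p, 0.6, 1.0):.3f}")
```

Adaptive design optimization works by choosing gamble pairs on which such functional forms make maximally different predictions.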
iFAB Smart Manufacturing Adapting Rapidly to Product Variants (SMARTV)
2012-05-01
of all welds, only one of each can be reached as the angular approach of the robot in its current configuration, with the laser scanner (oriented at...the seam length, the exact trace of the seam can be computed from the intersection point ([X,Y]) of the two lines and their angular bisector ([Θ...php scripts is generated by using the data extracted from plan.xml, filling the appropriate language constructs with this data, and querying the
1978-09-01
generally recognized that the best possible configuration for engines operating at high speeds and at high-pressure levels is probably the single...engines is invariably accomplished by the operation of computer simulation models that generate specific numerical data rather than the generalized relationships common to other forms of prime mover based on units of mass or volume. Thus, providing such generalized relationships for a Stirling
A Computer Simulation Model of Fluid Flow Through a Channel with Constriction
2013-06-01
separation in blood flow rather than mechanical pressure. While it is very unlikely that there is a net electric charge generated by blood flow, there...gate valve as measured by a mechanical flowmeter. The height of the fluid in the upper reservoir was maintained at a constant level by means of an...
2016-12-01
reconstruction of the adult model was originally developed by Kepler et al. (1998) from serial Magnetic Resonance Imaging (MRI) sections of the right...upper airways and MRI imaging of a lung cast to form a contiguous reconstruction from the nostrils through 19 airway generations of the lung. For this...and Musante, C. J. (2001). A nonhuman primate aerosol deposition model for toxicological and pharmaceutical studies. Inhal. Toxicol. 13:307-324
Amoeba-Inspired Heuristic Search Dynamics for Exploring Chemical Reaction Paths.
Aono, Masashi; Wakabayashi, Masamitsu
2015-09-01
We propose a nature-inspired model for simulating chemical reactions in a computationally resource-saving manner. The model was developed by extending our previously proposed heuristic search algorithm, "AmoebaSAT" [Aono et al. 2013], which was inspired by the spatiotemporal dynamics of a single-celled amoeboid organism that exhibits sophisticated computing capabilities in adapting to its environment efficiently [Zhu et al. 2013]. AmoebaSAT solves the satisfiability problem, an NP-complete combinatorial optimization problem [Garey and Johnson 1979], and finds a constraint-satisfying solution dramatically faster than one of the fastest known stochastic local search methods [Iwama and Tamaki 2004] on a class of randomly generated problem instances [ http://www.cs.ubc.ca/~hoos/5/benchm.html ]. In cases where the problem has more than one solution, AmoebaSAT exhibits dynamic transitions among the various solutions. Inheriting these features of AmoebaSAT, we formulate "AmoebaChem," which explores a variety of metastable molecules satisfying constraints determined by the input atoms and generates dynamic transition processes among the metastable molecules. AmoebaChem and its developed forms will be applied to the study of the origins of life, to discover reaction paths by which expected or unexpected organic compounds may be formed via unknown unstable intermediates, and to estimate the likelihood of each of the discovered paths.
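For contrast with AmoebaSAT, the conventional stochastic local search cited as the baseline can be sketched in a few lines; this is WalkSAT-style search, not AmoebaSAT's amoeba-inspired dynamics:

```python
import random

def walksat(clauses, n_vars, p=0.5, max_flips=100000, seed=0):
    """WalkSAT-style stochastic local search for SAT. Clauses are lists
    of nonzero ints; a negative literal means the variable is negated."""
    rng = random.Random(seed)
    assign = [False] + [rng.random() < 0.5 for _ in range(n_vars)]  # 1-indexed

    def sat(lit):
        return assign[abs(lit)] == (lit > 0)

    for _ in range(max_flips):
        unsat = [c for c in clauses if not any(sat(l) for l in c)]
        if not unsat:
            return assign
        clause = rng.choice(unsat)
        if rng.random() < p:
            var = abs(rng.choice(clause))      # random-walk move
        else:                                  # greedy move
            def num_sat_after_flip(v):
                assign[v] = not assign[v]
                n = sum(any(sat(l) for l in c) for c in clauses)
                assign[v] = not assign[v]
                return n
            var = max((abs(l) for l in clause), key=num_sat_after_flip)
        assign[var] = not assign[var]
    return None

# (x1 or x2) and (not x1 or x3) and (not x2 or not x3)
clauses = [[1, 2], [-1, 3], [-2, -3]]
model = walksat(clauses, 3)
print("solution found:", model is not None)
```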
Non-harmful insertion of data mimicking computer network attacks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neil, Joshua Charles; Kent, Alexander; Hash, Jr, Curtis Lee
Non-harmful data mimicking computer network attacks may be inserted in a computer network. Anomalous real network connections may be generated between a plurality of computing systems in the network. Data mimicking an attack may also be generated. The generated data may be transmitted between the plurality of computing systems using the real network connections and measured to determine whether an attack is detected.
NASA Astrophysics Data System (ADS)
Michel, Patrick; Richardson, D. C.
2007-10-01
We have made major improvements in simulations of asteroid disruption by computing explicitly aggregate formations during the gravitational reaccumulation of small fragments, allowing us to obtain information on their spin and shape. First results will be presented taking as examples asteroid families that we reproduced successfully with previous, less sophisticated simulations. In recent years, we have successfully simulated the formation of asteroid families using an SPH hydrocode to compute the fragmentation following the impact of a projectile on the parent body, and the N-body code pkdgrav to compute the mutual interactions of the fragments. We found that fragments generated by the disruption of a km-size asteroid can have large enough masses to be attracted by each other during their ejection. Consequently, many reaccumulations take place. Eventually most large fragments correspond to gravitational aggregates formed by reaccumulation of smaller ones. Moreover, formation of satellites occurs around the largest and other big remnants. In these previous simulations, when fragments reaccumulate, they merge into a single sphere whose mass is the sum of their masses. Thus, no information is obtained on the actual shape of the aggregates, their spin, ... For the first time, we have now simulated the disruption of a family parent body by computing explicitly the formation of aggregates, along with the above-mentioned properties. Once formed, these aggregates can interact and/or collide with each other and break up during their evolution. We will present these first simulations and their possible implications for the properties of asteroids generated by disruption. Results can, for instance, be compared with the data on asteroid Itokawa provided by the Japanese space mission Hayabusa, a body now understood to be a reaccumulated fragment from a larger parent body.
Acknowledgments: PM and DCR acknowledge support from the French Programme National de Planétologie and NSF grants AST0307549 and AST0708110.
Skariyachan, Sinosh; Narayan, Naik Sowmyalaxmi; Aggimath, Tejaswini S; Nagaraj, Sushmitha; Reddy, Monika S; Narayanappa, Rajeswari
2014-03-01
Streptococcus pyogenes is a notorious pathogenic bacterium which causes various human diseases ranging from localized infections to life-threatening invasive diseases. Streptolysin-O (SLO), a pore-forming thiol-activated cytolysin, is the major virulence factor in streptococcal infections. Present therapies against streptococcal infections are limited, as most strains have developed multi-drug resistance to the present generation of drugs. Hence, there is a need for alternative therapeutic substances. Structure-based virtual screening is a novel platform for selecting lead molecules with better pharmacokinetic properties. The 3D structure of SLO (not available in native form), essential for such studies, was computationally generated, and this homology model was used as a probable drug target. Based on a literature survey, several phytoligands from 25 medicinal plants were selected. Of these, leads from 11 plants showed better pharmacokinetic properties. The best lead molecules were screened based on computer-aided drug-likeness and pharmacokinetic predictions. The inhibitory properties of the selected herbal leads against SLO were studied by molecular docking. An in vitro assay was further carried out, and the variations observed were found to be significant (p<0.05). Antibiotic sensitivity testing was also performed on the clinical strain of Streptococcus pyogenes with conventional drugs. The clinical strain showed multi-drug resistance to conventional drugs. Our study revealed that numerous phytoligands have better inhibitory properties towards the toxin. We noticed that incorporation of selected herbal extracts in blood agar medium showed significant reduction in hemolysis (MIC 300 μl/plate), indicating inhibition of SLO. Furthermore, the butanol extracts of the selected herbal preparation based on computer-aided screening showed significant inhibitory properties at a concentration of 250 mcg/disc.
We also noticed that the selected herbal formulations have better antimicrobial properties at an MIC range of 300-400 μl. Hence, our study suggests that these herbal extracts have better inhibitory properties against the toxin as well as against drug-resistant Streptococcus pyogenes.
Sohlberg, Karl; Bazargan, Gloria; Angelo, Joseph P; Lee, Choongkeun
2017-01-01
Herein we report a study of the switchable [3]rotaxane reported by Huang et al. (Appl Phys Lett 85(22):5391-5393) that can be mounted to a surface to form a nanomechanical, linear, molecular motor. We demonstrate the application of semiempirical electronic structure theory to predict the average and instantaneous force generated by redox-induced ring shuttling. Detailed analysis of the geometric and electronic structure of the system reveals technical considerations essential to the success of the approach. The force is found to be in the 100-200 pN range, consistent with published experimental estimates. Graphical abstract: a single surface-mounted switchable rotaxane.
Development of Creep-Resistant, Alumina-Forming Ferrous Alloys for High-Temperature Structural Use
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yamamoto, Yukinori; Brady, Michael P.; Muralidharan, Govindarajan
This paper overviews recent advances in developing novel alloy design concepts of creep-resistant, alumina-forming Fe-base alloys, including both ferritic and austenitic steels, for high-temperature structural applications in fossil-fired power generation systems. Protective, external alumina-scales offer improved oxidation resistance compared to chromia-scales in steam-containing environments at elevated temperatures. Alloy design utilizes computational thermodynamic tools with compositional guidelines based on experimental results accumulated in the last decade, along with design and control of the second-phase precipitates to maximize high-temperature strengths. The alloys developed to date, including ferritic (Fe-Cr-Al-Nb-W base) and austenitic (Fe-Cr-Ni-Al-Nb base) alloys, successfully incorporated the balanced properties of steam/water vapor-oxidation and/or ash-corrosion resistance and improved creep strength. Development of cast alumina-forming austenitic (AFA) stainless steel alloys is also in progress, with successful improvement of higher temperature capability targeting up to ~1100°C. The current alloy design approach and developmental efforts with guidance of computational tools were found to be beneficial for further development of new heat-resistant steel alloys for various extreme environments.
Neuronal avalanches, epileptic quakes and other transient forms of neurodynamics.
Milton, John G
2012-07-01
Power-law behaviors in brain activity in healthy animals, in the form of neuronal avalanches, potentially benefit the computational activities of the brain, including information storage, transmission and processing. In contrast, power-law behaviors associated with seizures, in the form of epileptic quakes, potentially interfere with the brain's computational activities. This review draws attention to the potential roles played by homeostatic mechanisms and multistable time-delayed recurrent inhibitory loops in the generation of power-law phenomena. Moreover, it is suggested that distinctions between health and disease are scale-dependent. In other words, what is abnormal and defines disease is not the propagation of neural activity but the propagation of activity in a neural population that is large enough to interfere with the normal activities of the brain. From this point of view, epilepsy is a disease that results from a failure of mechanisms, possibly located in part in the cortex itself or in the deep brain nuclei and brainstem, which truncate or otherwise confine the spatiotemporal scales of these power-law phenomena. © 2012 The Author. European Journal of Neuroscience © 2012 Federation of European Neuroscience Societies and Blackwell Publishing Ltd.
Quantum machine learning: a classical perspective
NASA Astrophysics Data System (ADS)
Ciliberto, Carlo; Herbster, Mark; Ialongo, Alessandro Davide; Pontil, Massimiliano; Rocchetto, Andrea; Severini, Simone; Wossnig, Leonard
2018-01-01
Recently, increased computational power and data availability, as well as algorithmic advances, have led machine learning (ML) techniques to impressive results in regression, classification, data generation and reinforcement learning tasks. Despite these successes, the proximity to the physical limits of chip fabrication alongside the increasing size of datasets is motivating a growing number of researchers to explore the possibility of harnessing the power of quantum computation to speed up classical ML algorithms. Here we review the literature in quantum ML and discuss perspectives for a mixed readership of classical ML and quantum computation experts. Particular emphasis will be placed on clarifying the limitations of quantum algorithms, how they compare with their best classical counterparts and why quantum resources are expected to provide advantages for learning problems. Learning in the presence of noise and certain computationally hard problems in ML are identified as promising directions for the field. Practical questions, such as how to upload classical data into quantum form, will also be addressed.
NASA Technical Reports Server (NTRS)
Gloss, R. J.
1971-01-01
A finite difference turbulent boundary layer computer program which allows for mass transfer, wall cooling, and equilibrium chemistry effects is presented. The program is capable of calculating laminar or turbulent boundary layer solutions for an arbitrary ideal gas or an equilibrium hydrogen-oxygen system. Either two-dimensional or axisymmetric geometric configurations may be considered. The equations are solved, in nondimensionalized physical coordinates, using the implicit Crank-Nicolson technique. The finite difference forms of the conservation of mass, momentum, total enthalpy and elements equations are linearized and uncoupled, thereby generating easily solvable tridiagonal sets of algebraic equations. A detailed description of the computer program, as well as a program user's manual, is provided. Detailed descriptions of all boundary layer subroutines are included, as well as a section defining all program symbols of principal importance. Instructions are then given for preparing card input to the program and for interpreting the printed output. Finally, two sample cases are included to illustrate the use of the program.
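The Crank-Nicolson linearization described above reduces each marching step to a tridiagonal linear system. As an illustration of how such systems are solved in O(n) (a generic Thomas-algorithm sketch, not the program's actual Fortran routine):

```python
def thomas(a, b, c, d):
    """Solve a tridiagonal system A x = d, where a is the sub-diagonal,
    b the main diagonal, c the super-diagonal (a[0] and c[-1] unused).
    Forward elimination followed by back substitution."""
    n = len(b)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]          # pivot after elimination
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):           # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x
```

For instance, the system with diagonals b=[2,2,2], a=[·,1,1], c=[1,1,·] and right-hand side [4, 8, 8] has solution [1, 2, 3].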
Bacterial computing: a form of natural computing and its applications.
Lahoz-Beltra, Rafael; Navarro, Jorge; Marijuán, Pedro C
2014-01-01
The capability to establish adaptive relationships with the environment is an essential characteristic of living cells. Both bacterial computing and bacterial intelligence are general traits manifested in adaptive behaviors that respond to surrounding environmental conditions. These two traits have generated a variety of theoretical and applied approaches. As the different systems of bacterial signaling and the different modes of genetic change become better known and more carefully explored, the whole range of adaptive possibilities of bacteria may be studied from new angles. For instance, instances of molecular "learning" appear along the mechanisms of evolution. More concretely, looking specifically at the time dimension, the bacterial mechanisms of learning and evolution appear as two distinct but related mechanisms for adaptation to the environment: the former in somatic time, the latter in evolutionary time. The present chapter reviews the possible application of both kinds of mechanisms to prokaryotic molecular computing schemes, as well as to the solution of real-world problems.
Geometry definition and grid generation for a complete fighter aircraft
NASA Technical Reports Server (NTRS)
Edwards, T. A.
1986-01-01
Recent advances in computing power and numerical solution procedures have enabled computational fluid dynamicists to attempt increasingly difficult problems. In particular, efforts are focusing on computations of complex three-dimensional flow fields about realistic aerodynamic bodies. To perform such computations, a very accurate and detailed description of the surface geometry must be provided, and a three-dimensional grid must be generated in the space around the body. The geometry must be supplied in a format compatible with the grid generation requirements, and must be verified to be free of inconsistencies. This paper presents a procedure for performing the geometry definition of a fighter aircraft that makes use of a commercial computer-aided design/computer-aided manufacturing system. Furthermore, visual representations of the geometry are generated using a computer graphics system for verification of the body definition. Finally, the three-dimensional grids for fighter-like aircraft are generated by means of an efficient new parabolic grid generation method. This method exhibits good control of grid quality.
NASA Astrophysics Data System (ADS)
Vodenicarevic, D.; Locatelli, N.; Mizrahi, A.; Friedman, J. S.; Vincent, A. F.; Romera, M.; Fukushima, A.; Yakushiji, K.; Kubota, H.; Yuasa, S.; Tiwari, S.; Grollier, J.; Querlioz, D.
2017-11-01
Low-energy random number generation is critical for many emerging computing schemes proposed to complement or replace von Neumann architectures. However, current random number generators are always associated with an energy cost that is prohibitive for these computing schemes. We introduce random number bit generation based on specific nanodevices: superparamagnetic tunnel junctions. We experimentally demonstrate high-quality random bit generation that represents an orders-of-magnitude improvement in energy efficiency over current solutions. We show that the random generation speed improves with nanodevice scaling, and we investigate the impact of temperature, magnetic field, and cross talk. Finally, we show how alternative computing schemes can be implemented using superparamagnetic tunnel junctions as random number generators. These results open the way for fabricating efficient hardware computing devices leveraging stochasticity, and they highlight an alternative use for emerging nanodevices.
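Physical entropy sources such as the stochastic nanodevices above typically feed a post-processing stage that removes residual bias. The abstract does not describe one, but a classic, minimal conditioning scheme is von Neumann debiasing, sketched here for illustration:

```python
def von_neumann_debias(bits):
    """Von Neumann extractor: consume the raw bit stream in pairs and
    map 01 -> 0, 10 -> 1, discarding 00 and 11. If input bits are
    independent with any fixed bias p, the output bits are unbiased
    (at the cost of throwing away most of the stream)."""
    out = []
    for b0, b1 in zip(bits[::2], bits[1::2]):
        if b0 != b1:
            out.append(b0)
    return out
```

For example, the raw stream `[0,1, 1,0, 0,0, 1,1]` yields `[0, 1]`: the first two pairs produce one output bit each, and the equal pairs are discarded.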
Hopper, Richard A; Sandercoe, Gavin; Woo, Albert; Watts, Robyn; Kelley, Patrick; Ettinger, Russell E; Saltzman, Babette
2010-11-01
Le Fort III distraction requires generation of bone in the pterygomaxillary region. The authors performed retrospective digital analysis on temporal fine-cut computed tomographic images to quantify both radiographic evidence of pterygomaxillary region bone formation and relative maxillary stability. Fifteen patients with syndromic midface hypoplasia were included in the study. The average age of the patients was 8.7 years; 11 had either Crouzon or Apert syndrome. The average displacement of the maxilla during distraction was 16.2 mm (range, 7 to 31 mm). Digital analysis was performed on fine-cut computed tomographic scans before surgery, at device removal, and at annual follow-up. Seven patients also had mid-consolidation computed tomographic scans. Relative maxillary stability and density of radiographic bone in the pterygomaxillary region were calculated between each scan. There was no evidence of clinically significant maxillary relapse, rotation, or growth between the end of consolidation and 1-year follow-up, other than a relatively small 2-mm subnasal maxillary vertical growth. There was an average radiographic ossification of 0.5 mm/mm advancement at the time of device removal, with a 25th percentile value of 0.3 mm/mm. The time during consolidation that each patient reached the 25th percentile of pterygomaxillary region bone density observed in this series of clinically stable advancements ranged from 1.3 to 9.8 weeks (average, 3.7 weeks). There was high variability in the amount of bone formed in the pterygomaxillary region associated with clinical stability of the advanced Le Fort III segment. These data suggest that a subsection of patients generate the minimal amount of pterygomaxillary region bone formation associated with advancement stability as early as 4 weeks into consolidation.
Two schemes for rapid generation of digital video holograms using PC cluster
NASA Astrophysics Data System (ADS)
Park, Hanhoon; Song, Joongseok; Kim, Changseob; Park, Jong-Il
2017-12-01
Computer-generated holography (CGH), which is a process of generating digital holograms, is computationally expensive. Recently, several methods/systems of parallelizing the process using graphic processing units (GPUs) have been proposed. Indeed, use of multiple GPUs or a personal computer (PC) cluster (each PC with GPUs) enabled great improvements in the process speed. However, extant literature has less often explored systems involving rapid generation of multiple digital holograms and specialized systems for rapid generation of a digital video hologram. This study proposes a system that uses a PC cluster and is able to more efficiently generate a video hologram. The proposed system is designed to simultaneously generate multiple frames and accelerate the generation by parallelizing the CGH computations across a number of frames, as opposed to separately generating each individual frame while parallelizing the CGH computations within each frame. The proposed system also enables the subprocesses for generating each frame to execute in parallel through multithreading. With these two schemes, the proposed system significantly reduced the data communication time for generating a digital hologram when compared with that of the state-of-the-art system.
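The key design choice above is parallelizing across frames rather than within a frame. The sketch below illustrates that scheduling idea only; the point-source phase kernel, grid sizes, and use of Python threads are stand-ins for the paper's GPU/PC-cluster implementation, and all names are illustrative.

```python
import math
from concurrent.futures import ThreadPoolExecutor

def frame_hologram(points, width=8, height=8, wavelength=0.5e-6, pitch=1e-5):
    """Compute one frame's phase hologram by summing contributions of
    3-D object points (a simple point-source CGH kernel, used here
    only as a placeholder workload)."""
    k = 2 * math.pi / wavelength
    frame = [[0.0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            re = im = 0.0
            for (px, py, pz) in points:
                r = math.sqrt(((x - px) * pitch) ** 2 +
                              ((y - py) * pitch) ** 2 + pz ** 2)
                re += math.cos(k * r)
                im += math.sin(k * r)
            frame[y][x] = math.atan2(im, re)  # phase at this pixel
    return frame

def render_video(frames_of_points, workers=4):
    """Inter-frame parallelism: each worker generates whole frames
    concurrently, instead of splitting the CGH computation inside a
    single frame across workers."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(frame_hologram, frames_of_points))
```

The trade-off is the one the abstract describes: with per-frame distribution, workers stay busy as long as frames remain in the queue, and cross-node communication happens once per frame rather than once per sub-block of every frame.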
Stanislawski, Jerzy; Kotulska, Malgorzata; Unold, Olgierd
2013-01-17
Amyloids are proteins capable of forming fibrils. Many of them underlie serious diseases, like Alzheimer's disease, and the number of amyloid-associated diseases is constantly increasing. Recent studies indicate that amyloidogenic properties can be associated with short segments of amino acids, which transform the structure when exposed. A few hundred such peptides have been found experimentally. Experimental testing of all possible amino acid combinations is currently not feasible; instead, they can be predicted by computational methods. The 3D profile is a physicochemistry-based method that has generated the largest such dataset, ZipperDB. However, it is computationally very demanding. Here, we show that dataset generation can be accelerated. Two methods to increase the classification efficiency of amyloidogenic candidates are presented and tested: simplified 3D profile generation and machine learning methods. We generated a new dataset of hexapeptides using a more economical 3D profile algorithm, which showed very good classification overlap with ZipperDB (93.5%). The new part of our dataset contains 1779 segments, with 204 classified as amyloidogenic. The dataset of 6-residue sequences with their binary classification, based on the energy of the segment, was applied for training machine learning methods, with a separate set of sequences from ZipperDB used as a test set. The most effective methods were the Alternating Decision Tree and the Multilayer Perceptron. Both obtained an area under the ROC curve of 0.96, accuracy of 91%, a true positive rate of ca. 78%, and a true negative rate of 95%. A few other machine learning methods also achieved good performance. The computational time was reduced from 18-20 CPU-hours (full 3D profile) to 0.5 CPU-hours (simplified 3D profile) to seconds (machine learning). We showed that the simplified profile generation method does not introduce error relative to the original method, while increasing computational efficiency.
Our new dataset proved representative enough to test amyloidogenicity with simple statistical methods based only on the six-letter sequences. Statistical machine learning methods such as the Alternating Decision Tree and the Multilayer Perceptron can replace the energy-based classifier, with the advantage of greatly reduced computational time and simplicity of analysis. Additionally, a decision tree provides a set of easily interpretable rules.
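As a toy illustration of the machine-learning route described above, a hexapeptide can be one-hot encoded and fed to a classifier. The perceptron below is a simplified linear stand-in for the paper's Multilayer Perceptron, and the labels in the usage example are synthetic: NNQQNY and VQIVYK are known fibril-forming segments, while the poly-A and poly-G negatives are purely illustrative.

```python
AMINO = "ACDEFGHIKLMNPQRSTVWY"  # the 20 standard residues

def one_hot(hexapeptide):
    """Encode a 6-residue sequence as a 6x20 binary feature vector."""
    vec = [0] * (6 * len(AMINO))
    for i, aa in enumerate(hexapeptide):
        vec[i * len(AMINO) + AMINO.index(aa)] = 1
    return vec

def train_perceptron(data, epochs=20, lr=1.0):
    """data: list of (hexapeptide, label) with label in {0, 1}.
    Classic perceptron updates on misclassified examples."""
    w = [0.0] * (6 * len(AMINO))
    b = 0.0
    for _ in range(epochs):
        for seq, y in data:
            x = one_hot(seq)
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            if pred != y:
                for j, xj in enumerate(x):
                    w[j] += lr * (y - pred) * xj
                b += lr * (y - pred)
    return w, b

def predict(model, seq):
    w, b = model
    x = one_hot(seq)
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
```

On a tiny synthetic set such as `[("NNQQNY", 1), ("AAAAAA", 0), ("VQIVYK", 1), ("GGGGGG", 0)]`, the perceptron separates the classes after a couple of epochs; a real replication of the paper's results would of course train on the full ZipperDB-derived dataset.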
TDIGG - TWO-DIMENSIONAL, INTERACTIVE GRID GENERATION CODE
NASA Technical Reports Server (NTRS)
Vu, B. T.
1994-01-01
TDIGG is a fast and versatile program for generating two-dimensional computational grids for use with finite-difference flow-solvers. Both algebraic and elliptic grid generation systems are included. The method for grid generation by algebraic transformation is based on an interpolation algorithm and the elliptic grid generation is established by solving the partial differential equation (PDE). Non-uniform grid distributions are carried out using a hyperbolic tangent stretching function. For algebraic grid systems, interpolations in one direction (univariate) and two directions (bivariate) are considered. These interpolations are associated with linear or cubic Lagrangian/Hermite/Bezier polynomial functions. The algebraic grids can subsequently be smoothed using an elliptic solver. For elliptic grid systems, the PDE can be in the form of Laplace (zero forcing function) or Poisson. The forcing functions in the Poisson equation come from the boundary or the entire domain of the initial algebraic grids. A graphics interface procedure using the Silicon Graphics (GL) Library is included to allow users to visualize the grid variations at each iteration. This will allow users to interactively modify the grid to match their applications. TDIGG is written in FORTRAN 77 for Silicon Graphics IRIS series computers running IRIX. This package requires either MIT's X Window System, Version 11 Revision 4 or SGI (Motif) Window System. A sample executable is provided on the distribution medium. It requires 148K of RAM for execution. The standard distribution medium is a .25 inch streaming magnetic IRIX tape cartridge in UNIX tar format. This program was developed in 1992.
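The hyperbolic tangent stretching mentioned above concentrates grid points near a boundary. A one-dimensional sketch of such a clustering function follows; the exact form and parameterization used in TDIGG may differ.

```python
import math

def tanh_stretch(n, beta=2.0):
    """Map n uniformly spaced parameter values onto [0, 1] with
    hyperbolic-tangent clustering: spacing is smallest near 0
    (e.g. at a wall) and grows toward 1. Larger beta clusters harder."""
    return [1.0 + math.tanh(beta * (i / (n - 1) - 1.0)) / math.tanh(beta)
            for i in range(n)]
```

For n = 5 and beta = 2 the points come out roughly as 0, 0.06, 0.21, 0.52, 1.0: monotone, endpoint-exact, with the finest spacing at the clustered end.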
A computational model of selection by consequences.
McDowell, J J
2004-05-01
Darwinian selection by consequences was instantiated in a computational model that consisted of a repertoire of behaviors undergoing selection, reproduction, and mutation over many generations. The model in effect created a digital organism that emitted behavior continuously. The behavior of this digital organism was studied in three series of computational experiments that arranged reinforcement according to random-interval (RI) schedules. The quantitative features of the model were varied over wide ranges in these experiments, and many of the qualitative features of the model also were varied. The digital organism consistently showed a hyperbolic relation between response and reinforcement rates, and this hyperbolic description of the data was consistently better than the description provided by other, similar functional forms. In addition, the parameters of the hyperbola varied systematically with the quantitative, and some of the qualitative, properties of the model in ways that were consistent with findings from biological organisms. These results suggest that the material events responsible for an organism's responding on RI schedules are computationally equivalent to Darwinian selection by consequences. They also suggest that the computational model developed here is worth pursuing further as a possible dynamic account of behavior.
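The hyperbolic relation reported above is conventionally written as Herrnstein's hyperbola, B = k r / (r + r_e); whether the model's fits used exactly this parameterization is an assumption here, but it makes the shape of the claim concrete:

```python
def hyperbolic_response_rate(r, k, r_e):
    """Herrnstein-style hyperbola relating reinforcement rate r to
    response rate B: B = k * r / (r + r_e).
    k   : asymptotic response rate as r grows without bound
    r_e : background ("extraneous") reinforcement rate; at r = r_e
          the organism responds at half of k."""
    return k * r / (r + r_e)
```

For example, with k = 80 responses/min and r_e = 20 reinforcers/hr, a reinforcement rate of 100 yields 80*100/120 ≈ 66.7 responses/min, and the rate approaches (but never exceeds) 80 as r increases.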
Amplify scientific discovery with artificial intelligence
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gil, Yolanda; Greaves, Mark T.; Hendler, James
Computing innovations have fundamentally changed many aspects of scientific inquiry. For example, advances in robotics, high-end computing, networking, and databases now underlie much of what we do in science, such as gene sequencing, general number crunching, sharing information between scientists, and analyzing large amounts of data. As computing has evolved at a rapid pace, so too has its impact in science, with the most recent computing innovations repeatedly being brought to bear to facilitate new forms of inquiry. Recently, advances in Artificial Intelligence (AI) have deeply penetrated many consumer sectors, including for example Apple's Siri™ speech recognition system, real-time automated language translation services, and a new generation of self-driving cars and self-navigating drones. However, AI has yet to achieve comparable levels of penetration in scientific inquiry, despite its tremendous potential in aiding computers to help scientists tackle tasks that require scientific reasoning. We contend that advances in AI will transform the practice of science as we are increasingly able to effectively and jointly harness human and machine intelligence in the pursuit of major scientific challenges.
Building an organic computing device with multiple interconnected brains
Pais-Vieira, Miguel; Chiuffa, Gabriela; Lebedev, Mikhail; Yadav, Amol; Nicolelis, Miguel A. L.
2015-01-01
Recently, we proposed that Brainets, i.e. networks formed by multiple animal brains, cooperating and exchanging information in real time through direct brain-to-brain interfaces, could provide the core of a new type of computing device: an organic computer. Here, we describe the first experimental demonstration of such a Brainet, built by interconnecting four adult rat brains. Brainets worked by concurrently recording the extracellular electrical activity generated by populations of cortical neurons distributed across multiple rats chronically implanted with multi-electrode arrays. Cortical neuronal activity was recorded and analyzed in real time, and then delivered to the somatosensory cortices of other animals that participated in the Brainet using intracortical microstimulation (ICMS). Using this approach, different Brainet architectures solved a number of useful computational problems, such as discrete classification, image processing, storage and retrieval of tactile information, and even weather forecasting. Brainets consistently performed at the same or higher levels than single rats in these tasks. Based on these findings, we propose that Brainets could be used to investigate animal social behaviors as well as a test bed for exploring the properties and potential applications of organic computers. PMID:26158615
Engine structures modeling software system: Computer code. User's manual
NASA Technical Reports Server (NTRS)
1992-01-01
ESMOSS is a specialized software system for the construction of geometric descriptive and discrete analytical models of engine parts, components and substructures which can be transferred to finite element analysis programs such as NASTRAN. The software architecture of ESMOSS is designed in modular form with a central executive module through which the user controls and directs the development of the analytical model. Modules consist of a geometric shape generator, a library of discretization procedures, interfacing modules to join both geometric and discrete models, a deck generator to produce input for NASTRAN and a 'recipe' processor which generates geometric models from parametric definitions. ESMOSS can be executed both in interactive and batch modes. Interactive mode is considered to be the default mode and that mode will be assumed in the discussion in this document unless stated otherwise.
Planning in subsumption architectures
NASA Technical Reports Server (NTRS)
Chalfant, Eugene C.
1994-01-01
A subsumption planner using a parallel distributed computational paradigm based on the subsumption architecture for control of real-world capable robots is described. Virtual sensor state space is used as a planning tool to visualize the robot's anticipated effect on its environment. Decision sequences are generated based on the environmental situation expected at the time the robot must commit to a decision. Between decision points, the robot performs in a preprogrammed manner. A rudimentary, domain-specific partial world model contains enough information to extrapolate the end results of the rote behavior between decision points. A collective network of predictors operates in parallel with the reactive network, forming a recurrent network which generates plans as a hierarchy. Details of a plan segment are generated only when its execution is imminent. The use of the subsumption planner is demonstrated by a simple maze navigation problem.
Far-field coseismic ionospheric disturbances of Tohoku earthquake
NASA Astrophysics Data System (ADS)
Krasnov, V. M.; Drobzheva, Ya. V.; Chum, J.
2015-12-01
A computer code has been developed to simulate the generation of infrasonic waves by a strong earthquake at a distance of 9000 km from the epicenter, their propagation through the atmosphere, and their effects in the ionosphere. We provide estimates of the perturbations in the ionosphere at the height (210-220 km) where radio waves at the sounding frequency (3.595 MHz) of a continuous Doppler radar reflect. The ionospheric perturbations have a global character, amplitudes of 1.5-7.5% of the ambient value, and persist for ~1 h. The form of the calculated ionospheric disturbances matches the experimental results; the correlation coefficient between the calculated and experimental waveforms ranged from 0.68 to 0.9.
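The agreement reported above is quantified by a correlation coefficient between calculated and observed disturbance waveforms. A minimal sketch of that comparison, using synthetic traces (the waveforms below are illustrative assumptions, not the study's data):

```python
import numpy as np

def waveform_correlation(calculated, measured):
    """Pearson correlation coefficient between two sampled waveforms."""
    c = np.asarray(calculated, dtype=float)
    m = np.asarray(measured, dtype=float)
    c = c - c.mean()
    m = m - m.mean()
    return float(np.dot(c, m) / (np.linalg.norm(c) * np.linalg.norm(m)))

# Illustrative Doppler-shift traces: a damped oscillation standing in for
# the calculated disturbance, and a noisy copy standing in for the observed one.
t = np.linspace(0.0, 1.0, 500)
calc = np.exp(-2.0 * t) * np.sin(2.0 * np.pi * 5.0 * t)
rng = np.random.default_rng(0)
obs = calc + 0.1 * rng.standard_normal(t.size)
r = waveform_correlation(calc, obs)
```

A value of `r` near 1 indicates that the simulated and observed disturbance shapes agree, which is the sense in which the 0.68-0.9 range above is reported.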
A computer program to determine the possible daily release window for sky target experiments
NASA Technical Reports Server (NTRS)
Michaud, N. H.
1973-01-01
A computer program is presented which is designed to determine the daily release window for sky target experiments. Factors considered in the program include: (1) target illumination by the sun at release time and during the tracking period; (2) look angle elevation above local horizon from each tracking station to the target; (3) solar depression angle from the local horizon of each tracking station during the experimental period after target release; (4) lunar depression angle from the local horizon of each tracking station during the experimental period after target release; and (5) total sky background brightness as seen from each tracking station while viewing the target. Program output is produced in both graphic and data form. Output data can be plotted for a single calendar month or year. The numerical values used to generate the plots are furnished to permit a more detailed review of the computed daily release windows.
The computational nature of memory modification
Gershman, Samuel J; Monfils, Marie-H; Norman, Kenneth A; Niv, Yael
2017-01-01
Retrieving a memory can modify its influence on subsequent behavior. We develop a computational theory of memory modification, according to which modification of a memory trace occurs through classical associative learning, but which memory trace is eligible for modification depends on a structure learning mechanism that discovers the units of association by segmenting the stream of experience into statistically distinct clusters (latent causes). New memories are formed when the structure learning mechanism infers that a new latent cause underlies current sensory observations. By the same token, old memories are modified when old and new sensory observations are inferred to have been generated by the same latent cause. We derive this framework from probabilistic principles, and present a computational implementation. Simulations demonstrate that our model can reproduce the major experimental findings from studies of memory modification in the Pavlovian conditioning literature. DOI: http://dx.doi.org/10.7554/eLife.23763.001 PMID:28294944
Neural-like computing with populations of superparamagnetic basis functions.
Mizrahi, Alice; Hirtzlin, Tifenn; Fukushima, Akio; Kubota, Hitoshi; Yuasa, Shinji; Grollier, Julie; Querlioz, Damien
2018-04-18
In neuroscience, population coding theory demonstrates that neural assemblies can achieve fault-tolerant information processing. Mapped to nanoelectronics, this strategy could allow for reliable computing with scaled-down, noisy, imperfect devices. Doing so requires that the population components form a set of basis functions in terms of their response functions to inputs, offering a physical substrate for computing. Such a population can be implemented with CMOS technology, but the corresponding circuits have high area or energy requirements. Here, we show that nanoscale magnetic tunnel junctions can instead be assembled to meet these requirements. We demonstrate experimentally that a population of nine junctions can implement a basis set of functions, providing the data to achieve, for example, the generation of cursive letters. We design hybrid magnetic-CMOS systems based on interlinked populations of junctions and show that they can learn to realize non-linear variability-resilient transformations with a low imprint area and low power.
NASA Astrophysics Data System (ADS)
Hartmann, Jürgen; Nawroth, Thomas; Dose, Klaus
1984-12-01
Carbodiimide-mediated peptide synthesis in aqueous solution has been studied with respect to self-ordering of amino acids. The copolymerisation of amino acids in the presence of glutamic acid or pyroglutamic acid leads to short pyroglutamyl peptides. Without pyroglutamic acid the formation of higher polymers is favoured. The interactions of the amino acids and the peptides, however, are very complex, so the experimental results are rather difficult to explain. Some of them, however, can be explained with the aid of computer simulation programs. Considering only the tripeptide fraction, both the actual and the simulated copolymerisation of pyroGlu, Ala and Leu yield pyroGlu-Ala-Leu as the main reaction product. The amino acid composition of the insoluble peptides formed during the copolymerisation of Ser, Gly, Ala, Val, Phe, Leu and Ile corresponds in part to the computer-simulated copolymerisation data.
Generation and assessment of turntable SAR data for the support of ATR development
NASA Astrophysics Data System (ADS)
Cohen, Marvin N.; Showman, Gregory A.; Sangston, K. James; Sylvester, Vincent B.; Gostin, Lamar; Scheer, C. Ruby
1998-10-01
Inverse synthetic aperture radar (ISAR) imaging on a turntable-tower test range permits convenient generation of high-resolution two-dimensional images of radar targets under controlled conditions for testing SAR image processing and for supporting automatic target recognition (ATR) algorithm development. However, turntable ISAR images are often obtained under near-field geometries and hence may suffer geometric distortions not present in airborne SAR images. In this paper, turntable data collected at Georgia Tech's Electromagnetic Test Facility are used to begin to assess the utility of two-dimensional ISAR imaging algorithms in forming images to support ATR development. The imaging algorithms considered include a simple 2D discrete Fourier transform (DFT), a 2D DFT with geometric correction based on image-domain resampling, and a computationally intensive geometric matched filter solution. Images formed with the various algorithms are used to develop ATR templates, which are then compared with an eye toward utilization in an ATR algorithm.
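The simplest of the imaging algorithms named above, the plain 2D DFT, can be sketched as follows. The single-point-scatterer phase history and its cell position are hypothetical, and the far-field small-angle model deliberately ignores the near-field distortions the paper is concerned with:

```python
import numpy as np

def form_isar_image(phase_history):
    """Simplest 2D-DFT ISAR imaging: a 2D FFT of the (frequency x aspect)
    phase-history matrix, with the zero range/Doppler bin shifted to center."""
    return np.fft.fftshift(np.fft.fft2(phase_history))

# Under the far-field, small-angle approximation, a single point scatterer
# produces a 2D complex exponential phase history, so the formed image
# should show one dominant peak at the corresponding image cell.
n_freq, n_pulse = 64, 64
kx, ky = 10, -7  # hypothetical scatterer location in image cells
f, p = np.meshgrid(np.arange(n_freq), np.arange(n_pulse), indexing="ij")
history = np.exp(2j * np.pi * (kx * f / n_freq + ky * p / n_pulse))
image = np.abs(form_isar_image(history))
peak = np.unravel_index(np.argmax(image), image.shape)
```

The geometric-correction and matched-filter variants in the paper replace this direct transform with resampling or a per-pixel focusing operator to undo the near-field distortion.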
Normal forms of Hopf-zero singularity
NASA Astrophysics Data System (ADS)
Gazor, Majid; Mokhtari, Fahimeh
2015-01-01
The Lie algebra generated by Hopf-zero classical normal forms is decomposed into two versal Lie subalgebras. Some dynamical properties of each subalgebra are described; one is the set of all volume-preserving conservative systems, while the other is the maximal Lie algebra of nonconservative systems. This introduces a unique conservative-nonconservative decomposition for the normal form systems. There exists a Lie subalgebra that is Lie-isomorphic to a large family of vector fields with Bogdanov-Takens singularity. This leads to the conclusion that the local dynamics of formal Hopf-zero singularities are well understood through the study of Bogdanov-Takens singularities. Despite this, the normal form computations of Bogdanov-Takens and Hopf-zero singularities are independent. Thus, by assuming a quadratic nonzero condition, complete results on the simplest Hopf-zero normal forms are obtained in terms of the conservative-nonconservative decomposition. Some practical formulas are derived and the results are implemented using Maple. The method has been applied to the Rössler and Kuramoto-Sivashinsky equations to demonstrate the applicability of our results.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kahn, R.E.
1983-11-01
The fifth generation of computers is described. The three disciplines involved in bringing such a new generation to reality are microelectronics; artificial intelligence; and computer systems and architecture. Applications in industry, offices, aerospace, education, health care, and retailing are outlined. An analysis is given of research efforts in the US, Japan, UK, and Europe. Fifth-generation programming languages are detailed.
Autonomous perception and decision making in cyber-physical systems
NASA Astrophysics Data System (ADS)
Sarkar, Soumik
2011-07-01
The cyber-physical system (CPS) is a relatively new interdisciplinary technology area that includes the general class of embedded and hybrid systems. CPSs require integration of computation and physical processes, involving physical quantities such as time, energy, and space during information processing and control. The physical space is the source of information, and the cyber space makes use of the generated information to make decisions. This dissertation proposes an overall architecture for autonomous perception-based decision and control of complex cyber-physical systems. Perception builds on the recently developed framework of Symbolic Dynamic Filtering for abstraction of the physical world in the cyber space. Under this framework, sensor observations from a physical entity are discretized temporally and spatially to generate blocks of symbols, called words, that form a language. A grammar of a language is the set of rules that determine the relationships among words to build sentences. A physical system is then conjectured to be a linguistic source capable of generating a specific language. The proposed technology is validated on various experimental and simulated case studies, including health monitoring of aircraft gas turbine engines, detection and estimation of fatigue damage in polycrystalline alloys, and parameter identification. Control of complex cyber-physical systems involves distributed sensing, computation, and control, as well as complexity analysis. A novel statistical-mechanics-inspired complexity analysis approach is proposed in this dissertation. In such a scenario of networked physical systems, the distribution of physical entities determines the underlying network topology, and the interaction among the entities forms the abstract cyber space.
It is envisioned that the general contributions, made in this dissertation, will be useful for potential application areas such as smart power grids and buildings, distributed energy systems, advanced health care procedures and future ground and air transportation systems.
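The symbolization step of Symbolic Dynamic Filtering described above can be sketched as follows. The partitioning choice (equal-probability quantiles), the word length, and the sensor trace itself are assumptions for illustration, not taken from the dissertation's case studies:

```python
import numpy as np
from collections import Counter

def symbolize(series, n_symbols=4):
    """Coarse-grain a time series into symbols using equal-probability
    (quantile) partitioning, one common choice in symbolic dynamic filtering."""
    edges = np.quantile(series, np.linspace(0, 1, n_symbols + 1)[1:-1])
    return np.digitize(series, edges)

def word_distribution(symbols, word_len=2):
    """Relative frequencies of consecutive symbol blocks ('words')."""
    words = [tuple(symbols[i:i + word_len])
             for i in range(len(symbols) - word_len + 1)]
    counts = Counter(words)
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

# Illustrative sensor trace: a noisy oscillation standing in for an
# observation stream from a physical entity.
rng = np.random.default_rng(1)
trace = np.sin(np.linspace(0, 20 * np.pi, 2000)) + 0.05 * rng.standard_normal(2000)
dist = word_distribution(symbolize(trace))
```

The resulting word distribution is the "language" statistic on which anomaly or damage detection can then be based, e.g. by comparing distributions from nominal and off-nominal operation.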
Prietula, M J; Feltovich, P J; Marchak, F
2000-01-01
We propose that considering four categories of task factors can facilitate knowledge elicitation efforts in the analysis of complex cognitive tasks: materials, strategies, knowledge characteristics, and goals. A study was conducted to examine the effects of altering aspects of two of these task categories on problem-solving behavior across skill levels: materials and goals. Two versions of an applied engineering problem were presented to expert, intermediate, and novice participants. Participants were to minimize the cost of running a steam generation facility by adjusting steam generation levels and flows. One version was cast in the form of a dynamic, computer-based simulation that provided immediate feedback on flows, costs, and constraint violations, thus incorporating key variable dynamics of the problem context. The other version was cast as a static computer-based model, with no dynamic components, cost feedback, or constraint checking. Experts performed better than the other groups across material conditions, and, when required, the presentation of the goal assisted the experts more than the other groups. The static group generated richer protocols than the dynamic group, but the dynamic group solved the problem in significantly less time. Little effect of feedback was found for intermediates, and none for novices. We conclude that demonstrating differences in performance in this task requires different materials than explicating underlying knowledge that leads to performance. We also conclude that substantial knowledge is required to exploit the information yielded by the dynamic form of the task or the explicit solution goal. This simple model can help to identify the contextual factors that influence elicitation and specification of knowledge, which is essential in the engineering of joint cognitive systems.
McClelland, Arthur A; Ahn, Seokhoon; Matzger, Adam J; Chen, Zhan
2009-11-17
Sum frequency generation vibrational spectroscopy (SFG) has been applied to study two-dimensional (2D) crystals formed by an isophthalic acid diester on the surface of highly oriented pyrolytic graphite, providing complementary measurements to scanning tunneling microscopy (STM) and computational modeling. SFG results indicate that both aromatic and C=O groups in the 2D crystal tilt from the surface. This study demonstrates that a combination of SFG and STM techniques can be used to gain a more complete picture of 2D crystal structure, and it is necessary to consider solvent-2D crystal interactions and dynamics in the computer models to achieve an accurate representation of interfacial structure.
Techniques for grid manipulation and adaptation. [computational fluid dynamics
NASA Technical Reports Server (NTRS)
Choo, Yung K.; Eisemann, Peter R.; Lee, Ki D.
1992-01-01
Two approaches have been taken to provide systematic grid manipulation for improved grid quality. One is the control point form (CPF) of algebraic grid generation. It provides explicit control of the physical grid shape and grid spacing through the movement of the control points. It works well in the interactive computer graphics environment and hence can be a good candidate for integration with other emerging technologies. The other approach is grid adaptation using a numerical mapping between the physical space and a parametric space. Grid adaptation is achieved by modifying the mapping functions through the effects of grid control sources. The adaptation process can be repeated in a cyclic manner if satisfactory results are not achieved after a single application.
Benchmarking of neutron production of heavy-ion transport codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Remec, I.; Ronningen, R. M.; Heilbronn, L.
Document available in abstract form only, full text of document follows: Accurate prediction of radiation fields generated by heavy ion interactions is important in medical applications, space missions, and in design and operation of rare isotope research facilities. In recent years, several well-established computer codes in widespread use for particle and radiation transport calculations have been equipped with the capability to simulate heavy ion transport and interactions. To assess and validate these capabilities, we performed simulations of a series of benchmark-quality heavy ion experiments with the computer codes FLUKA, MARS15, MCNPX, and PHITS. We focus on the comparisons of secondary neutron production. Results are encouraging; however, further improvements in models and codes and additional benchmarking are required. (authors)
Computer-Aided Software Engineering - An approach to real-time software development
NASA Technical Reports Server (NTRS)
Walker, Carrie K.; Turkovich, John J.
1989-01-01
A new software engineering discipline is Computer-Aided Software Engineering (CASE), a technology aimed at automating the software development process. This paper explores the development of CASE technology, particularly in the area of real-time/scientific/engineering software, and a history of CASE is given. The proposed software development environment for the Advanced Launch System (ALS CASE) is described as an example of an advanced software development system for real-time/scientific/engineering (RT/SE) software. The Automated Programming Subsystem of ALS CASE automatically generates executable code and corresponding documentation from a suitably formatted specification of the software requirements. Software requirements are interactively specified in the form of engineering block diagrams. Several demonstrations of the Automated Programming Subsystem are discussed.
Computerized power supply analysis: State equation generation and terminal models
NASA Technical Reports Server (NTRS)
Garrett, S. J.
1978-01-01
To aid engineers that design power supply systems two analysis tools that can be used with the state equation analysis package were developed. These tools include integration routines that start with the description of a power supply in state equation form and yield analytical results. The first tool uses a computer program that works with the SUPER SCEPTRE circuit analysis program and prints the state equation for an electrical network. The state equations developed automatically by the computer program are used to develop an algorithm for reducing the number of state variables required to describe an electrical network. In this way a second tool is obtained in which the order of the network is reduced and a simpler terminal model is obtained.
Incorporating structure from motion uncertainty into image-based pose estimation
NASA Astrophysics Data System (ADS)
Ludington, Ben T.; Brown, Andrew P.; Sheffler, Michael J.; Taylor, Clark N.; Berardi, Stephen
2015-05-01
A method for generating and utilizing structure from motion (SfM) uncertainty estimates within image-based pose estimation is presented. The method is applied to a class of problems in which SfM algorithms are utilized to form a geo-registered reference model of a particular ground area using imagery gathered during flight by a small unmanned aircraft. The model is then used to form camera pose estimates in near real-time from imagery gathered later. The resulting pose estimates can be utilized by any of the other onboard systems (e.g. as a replacement for GPS data) or downstream exploitation systems, e.g., image-based object trackers. However, many of the consumers of pose estimates require an assessment of the pose accuracy. The method for generating the accuracy assessment is presented. First, the uncertainty in the reference model is estimated. Bundle Adjustment (BA) is utilized for model generation. While the high-level approach for generating a covariance matrix of the BA parameters is straightforward, typical computing hardware is not able to support the required operations due to the scale of the optimization problem within BA. Therefore, a series of sparse matrix operations is utilized to form an exact covariance matrix for only the parameters that are needed at a particular moment. Once the uncertainty in the model has been determined, it is used to augment Perspective-n-Point pose estimation algorithms to improve the pose accuracy and to estimate the resulting pose uncertainty. The implementation of the described method is presented along with results including results gathered from flight test data.
Software Surface Modeling and Grid Generation Steering Committee
NASA Technical Reports Server (NTRS)
Smith, Robert E. (Editor)
1992-01-01
It is a NASA objective to promote improvements in the capability and efficiency of computational fluid dynamics. Grid generation, the creation of a discrete representation of the solution domain, is an essential part of computational fluid dynamics. However, grid generation about complex boundaries requires sophisticated surface-model descriptions of the boundaries. The surface modeling and the associated computation of surface grids consume an extremely large percentage of the total time required for volume grid generation. Efficient and user friendly software systems for surface modeling and grid generation are critical for computational fluid dynamics to reach its potential. The papers presented here represent the state-of-the-art in software systems for surface modeling and grid generation. Several papers describe improved techniques for grid generation.
NASA Technical Reports Server (NTRS)
Chwalowski, Pawel; Florance, Jennifer P.; Heeg, Jennifer; Wieseman, Carol D.; Perry, Boyd P.
2011-01-01
This paper presents preliminary computational aeroelastic analysis results generated in preparation for the first Aeroelastic Prediction Workshop (AePW). These results were produced using FUN3D software developed at NASA Langley and are compared against the experimental data generated during the HIgh REynolds Number Aero-Structural Dynamics (HIRENASD) Project. The HIRENASD wind-tunnel model was tested in the European Transonic Windtunnel in 2006 by Aachen University's Department of Mechanics with funding from the German Research Foundation. The computational effort discussed here was performed (1) to obtain a preliminary assessment of the ability of the FUN3D code to accurately compute physical quantities experimentally measured on the HIRENASD model and (2) to translate the lessons learned from the FUN3D analysis of HIRENASD into a set of initial guidelines for the first AePW, which includes test cases for the HIRENASD model and its experimental data set. This paper compares the computational and experimental results obtained at Mach 0.8 for a Reynolds number of 7 million based on chord, corresponding to the HIRENASD test conditions No. 132 and No. 159. Aerodynamic loads and static aeroelastic displacements are compared at two levels of grid resolution. Harmonic perturbation numerical results are compared with the experimental data using the magnitude and phase relationship between pressure coefficients and displacement. A dynamic aeroelastic numerical calculation is presented at one wind-tunnel condition in the form of the time history of the generalized displacements. Additional FUN3D validation results are also presented for the AGARD 445.6 wing data set. This wing was tested in the Transonic Dynamics Tunnel and is commonly used in the preliminary benchmarking of computational aeroelastic software.
McGerald, Genevieve; Dvorkin, Ronald; Levy, David; Lovell-Rose, Stephanie; Sharma, Adhi
2009-06-01
Prescriptions for controlled substances decrease when regulatory barriers are put in place. The converse has not been studied. The objective was to determine whether a less complicated prescription writing process is associated with a change in the prescribing patterns of controlled substances in the emergency department (ED). The authors conducted a retrospective nonconcurrent cohort study of all patients seen in an adult ED between April 19, 2005, and April 18, 2007, who were discharged with a prescription. Prior to April 19, 2006, a specialized prescription form stored in a locked cabinet was obtained from the nursing staff to write a prescription for benzodiazepines or Schedule II opioids. After April 19, 2006, New York State mandated that all prescriptions, regardless of schedule classification, be generated on a specialized bar-coded prescription form. The main outcome of the study was to compare the proportion of Schedule III-V opioids to Schedule II opioids and benzodiazepines prescribed in the ED before and after the introduction of a less cumbersome prescription writing process. Of the 26,638 charts reviewed, 2.1% of the total number of prescriptions generated were for a Schedule II controlled opioid before the new system was implemented compared to 13.6% after (odds ratio [OR] = 7.3, 95% confidence interval [CI] = 6.4 to 8.4). The corresponding percentages for Schedule III-V opioids were 29.9% to 18.1% (OR = 0.52, 95% CI = 0.49 to 0.55) and for benzodiazepines 1.4% to 3.9% (OR = 2.8, 95% CI = 2.4 to 3.4). Patients were more likely to receive a prescription for a Schedule II opioid or a benzodiazepine after a more streamlined computer-generated prescription writing process was introduced in this ED. (c) 2009 by the Society for Academic Emergency Medicine.
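The odds ratios quoted above follow directly from the reported proportions. A quick check of the Schedule II figure (2.1% of prescriptions before the new form vs. 13.6% after), sketched in Python:

```python
def odds_ratio(p_after, p_before):
    """Odds ratio comparing a proportion after vs. before a change:
    (p_a / (1 - p_a)) divided by (p_b / (1 - p_b))."""
    return (p_after / (1 - p_after)) / (p_before / (1 - p_before))

# Schedule II opioids: 2.1% of prescriptions before, 13.6% after
or_schedule2 = odds_ratio(0.136, 0.021)   # ~7.3, matching the reported OR
# Schedule III-V opioids: 29.9% before, 18.1% after
or_schedule35 = odds_ratio(0.181, 0.299)  # ~0.52, matching the reported OR
```

Note that the odds ratio compares odds, not proportions; it exceeds the simple ratio 13.6/2.1 slightly because the denominators (1 minus the proportion) also shift.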
Bubble nucleation in simple and molecular liquids via the largest spherical cavity method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gonzalez, Miguel A., E-mail: m.gonzalez12@imperial.ac.uk; Department of Chemistry, Imperial College London, London SW7 2AZ; Abascal, José L. F.
2015-04-21
In this work, we propose a methodology to compute bubble nucleation free energy barriers using trajectories generated via molecular dynamics simulations. We follow the bubble nucleation process by means of a local order parameter, defined by the volume of the largest spherical cavity (LSC) formed in the nucleating trajectories. This order parameter simplifies considerably the monitoring of the nucleation events, as compared with previous approaches, which require ad hoc criteria to classify the atoms and molecules as liquid or vapor. The combination of the LSC and the mean first passage time technique can then be used to obtain the free energy curves. Upon computation of the cavity distribution function, the nucleation rate and free-energy barrier can then be computed. We test our method against recent computations of bubble nucleation in simple liquids and water at negative pressures, obtaining free-energy barriers in good agreement with the previous works. The LSC method provides a versatile and computationally efficient route to estimate the volume of critical bubbles and the nucleation rate, and to compute bubble nucleation free energies in both simple and molecular liquids.
NASA Astrophysics Data System (ADS)
Ribeiro, Allan; Santos, Helen
With the advent of new information and communication technologies (ICTs), communicative interaction changes how people act and relate, and at the same time changes how work related to education is carried out. Among the possibilities opened by advances in computational resources are virtual reality (VR) and augmented reality (AR), which stand out as new forms of information visualization in computer applications. While VR allows user interaction with a totally computer-generated virtual environment, in AR virtual images are inserted into the real environment; both create new opportunities to support teaching and learning in formal and informal contexts. Such technologies can express representations of reality or of the imagination, such as systems at the nanoscale and of low dimensionality, making it imperative to explore, in the most diverse areas of knowledge, the potential offered by ICTs and emerging technologies. In this sense, this work presents virtual and augmented reality computer applications developed with the use of modeling and simulation in computational approaches to topics related to nanoscience and nanotechnology, articulated with innovative pedagogical practices.
Computer Applications in Teaching and Learning.
ERIC Educational Resources Information Center
Halley, Fred S.; And Others
Some examples of the usage of computers in teaching and learning are examination generation, automatic exam grading, student tracking, problem generation, computational examination generators, program packages, simulation, and programing skills for problem solving. These applications are non-trivial and do fulfill the basic assumptions necessary…
Automated apparatus and method of generating native code for a stitching machine
NASA Technical Reports Server (NTRS)
Miller, Jeffrey L. (Inventor)
2000-01-01
A computer system automatically generates CNC code for a stitching machine. The computer determines the locations of a present stitching point and a next stitching point. If a constraint is not found between the present stitching point and the next stitching point, the computer generates code for making a stitch at the next stitching point. If a constraint is found, the computer generates code for changing a condition (e.g., direction) of the stitching machine's stitching head.
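The constraint-checking loop described in the abstract can be sketched as follows; the mnemonics (`STITCH`, `TURN HEAD`) and the constraint representation are hypothetical stand-ins, not the patent's actual CNC dialect:

```python
def generate_stitch_code(points, constraints):
    """Walk the stitch points in order: emit a stitch block for each move,
    preceded by a head-condition change when a constraint lies between the
    present stitching point and the next."""
    code = []
    for present, nxt in zip(points, points[1:]):
        if (present, nxt) in constraints:
            code.append("TURN HEAD")  # change condition of the stitching head
        code.append(f"STITCH X{nxt[0]} Y{nxt[1]}")
    return code

# Hypothetical three-point seam with one constrained segment
path = [(0, 0), (10, 0), (10, 5)]
blocked = {((10, 0), (10, 5))}
program = generate_stitch_code(path, blocked)
```

Running this on the example path yields `["STITCH X10 Y0", "TURN HEAD", "STITCH X10 Y5"]`: the unconstrained first segment becomes a plain stitch, while the constrained second segment is preceded by a head-condition change, mirroring the two branches described above.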
Le, Christine M; Sperger, Theresa; Fu, Rui; Hou, Xiao; Lim, Yong Hwan; Schoenebeck, Franziska; Lautens, Mark
2016-11-02
We report a highly robust, general and stereoselective method for the synthesis of 3-(chloromethylene)oxindoles from alkyne-tethered carbamoyl chlorides using PdCl2(PhCN)2 as the catalyst. The transformation involves a stereo- and regioselective chloropalladation of an internal alkyne to generate a nucleophilic vinyl Pd(II) species, which then undergoes an intramolecular cross-coupling with a carbamoyl chloride. The reaction proceeds under mild conditions, is insensitive to the presence of moisture and air, and is readily scalable. The products obtained from this reaction are formed with >95:5 Z:E selectivity in nearly all cases and can be used to access biologically relevant oxindole cores. Through combined experimental and computational studies, we provide insight into the stereo- and regioselectivity of the chloropalladation step, as well as the mechanism for the C-C bond-forming process. Calculations provide support for a mechanism involving oxidative addition into the carbamoyl chloride bond to generate a high-valent Pd(IV) species, which then undergoes facile C-C reductive elimination to form the final product. Overall, the transformation constitutes a formal Pd(II)-catalyzed intramolecular alkyne chlorocarbamoylation reaction.
A simple theory of back-surface-field /BSF/ solar cells
NASA Technical Reports Server (NTRS)
Von Roos, O.
1979-01-01
An earlier calculation of the I-V characteristics of solar cells contains a mistake. The current generated by light within the depletion layer is too large by a factor of 2. When this mistake is corrected, not only are all previous conclusions unchanged, but the agreement with experiment becomes better. Results are presented in graphical form of new computations which not only take account of the factor of 2, but also include more recent data on material parameters.
A new state space model for the NASA/JPL 70-meter antenna servo controls
NASA Technical Reports Server (NTRS)
Hill, R. E.
1987-01-01
A control axis referenced model of the NASA/JPL 70-m antenna structure is combined with the dynamic equations of servo components to produce a comprehensive state variable (matrix) model of the coupled system. An interactive Fortran program for generating the linear system model and computing its salient parameters is described. Results are produced in state variable, block diagram, and factored transfer function forms to facilitate design and analysis by classical as well as modern control methods.
NASA Technical Reports Server (NTRS)
Harwood, Kelly; Wickens, Christopher D.
1991-01-01
Computer-generated map displays for NOE and low-level helicopter flight were designed according to prior research on maps, navigational problem solving, and spatial cognition in large-scale environments. The north-up map emphasized consistency of object location, whereas the track-up map emphasized map-terrain congruency. A component analysis indicates that different cognitive components, e.g., orienting and absolute object location, are supported to varying degrees by properties of different frames of reference.
Evolution of Micro-Pores in a Single-Crystal Nickel-Based Superalloy During Solution Heat Treatment
NASA Astrophysics Data System (ADS)
Li, Xiangwei; Wang, Li; Dong, Jiasheng; Lou, Langhong; Zhang, Jian
2017-06-01
Evolution of micro-pores in a third-generation single-crystal nickel-based superalloy during solution heat treatment at 1603 K (1330 °C) was investigated by X-ray computed tomography. 3D information including morphology, size, number, and volume fraction of micro-pores formed during solidification (S-pores) and solution (H-pores) was analyzed. The growth behaviors of both S-pores and H-pores can be related to the vacancy formation and diffusion during heat treatment.
NASA Technical Reports Server (NTRS)
Madnia, C. K.; Frankel, S. H.; Givi, P.
1992-01-01
The presently obtained closed-form analytical expressions, which predict the limiting rate of mean reactant conversion in homogeneous turbulent flows under the influence of a binary reaction, are derived via the single-point pdf method based on amplitude mapping closure. With this model, the maximum rate of the mean reactant's decay can be conveniently expressed in terms of definite integrals of the parabolic cylinder functions. The results obtained are shown to be in good agreement with data generated by direct numerical simulations.
ONR Far East Scientific Bulletin, Volume 7, Number 2, April-June 1982,
1982-01-01
contained source code. PAL (Program Automation Language) is a system design language that automatically generates an executable program from a... A Glimpse at... tools exist at ECL in prototype forms. Like most major computer manufacturers, they have also extended high-level languages such as FORTRAN and COBOL
NASA Technical Reports Server (NTRS)
Baumann, P. R. (Principal Investigator)
1979-01-01
Three computer quantitative techniques for determining urban land cover patterns are evaluated. The techniques examined deal with the selection of training samples by an automated process, the overlaying of two scenes from different seasons of the year, and the use of individual pixels as training points. Evaluation is based on the number and type of land cover classes generated and the marks obtained from an accuracy test. New Orleans, Louisiana and its environs form the study area.
Stephenson, Jennifer
2009-03-01
Communication symbols for students with severe intellectual disabilities often take the form of computer-generated line drawings. This study investigated the effects of the match between color and shape of line drawings and the objects they represented on drawing recognition and use. The match or non-match between color and shape of the objects and drawings did not have an effect on participants' ability to match drawings to objects, or to use drawings to make choices.
Computational Aerodynamic Analysis of Three-Dimensional Ice Shapes on a NACA 23012 Airfoil
NASA Technical Reports Server (NTRS)
Jun, GaRam; Oliden, Daniel; Potapczuk, Mark G.; Tsao, Jen-Ching
2014-01-01
The present study identifies a process for performing computational fluid dynamic calculations of the flow over full three-dimensional (3D) representations of complex ice shapes deposited on aircraft surfaces. Rime and glaze icing geometries formed on a NACA23012 airfoil were obtained during testing in the NASA Glenn Research Center's Icing Research Tunnel (IRT). The ice shape geometries were scanned as a cloud of data points using a 3D laser scanner. The data point clouds were meshed using Geomagic software to create highly accurate models of the ice surface. The surface data was imported into Pointwise grid generation software to create the CFD surface and volume grids. It was determined that generating grids in Pointwise for complex 3D icing geometries was possible using various techniques that depended on the ice shape. Computations of the flow fields over these ice shapes were performed using the NASA National Combustion Code (NCC). Results for a rime ice shape for angle of attack conditions ranging from 0 to 10 degrees and for freestream Mach numbers of 0.10 and 0.18 are presented. For validation of the computational results, comparisons were made to test results from rapid-prototype models of the selected ice accretion shapes, obtained from a separate study in a subsonic wind tunnel at the University of Illinois at Urbana-Champaign. The computational and experimental results were compared for values of pressure coefficient and lift. Initial results show fairly good agreement for rime ice accretion simulations across the range of conditions examined. The glaze ice results are promising but require some further examination.
Fang, Yi; Peng, Chen; Guo, Rui; Zheng, Linfeng; Qin, Jinbao; Zhou, Benqing; Shen, Mingwu; Lu, Xinwu; Zhang, Guixiang; Shi, Xiangyang
2013-06-07
We report here a general approach to synthesizing dendrimer-stabilized bismuth sulfide nanoparticles (Bi2S3 DSNPs) for potential computed tomography (CT) imaging applications. In this study, ethylenediamine-core glycidol hydroxyl-terminated generation-4 poly(amidoamine) dendrimers (G4.NGlyOH) were used as stabilizers to first complex the Bi(III) ions, followed by reaction with hydrogen sulfide to generate Bi2S3 DSNPs. By varying the molar ratio of Bi atoms to dendrimer, stable Bi2S3 DSNPs with an average size range of 5.2-5.7 nm were formed. The formed Bi2S3 DSNPs were characterized via different techniques. X-ray absorption coefficient measurements show that the attenuation of Bi2S3 DSNPs is much higher than that of an iodine-based CT contrast agent at the same molar concentration of the active element (Bi versus iodine). A 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide (MTT) cell viability assay and a hemolysis assay reveal that the formed Bi2S3 DSNPs are noncytotoxic and have a negligible hemolysis effect in the studied concentration range. Furthermore, we show that cells incubated with the Bi2S3 DSNPs can be imaged using CT, that a prominent enhancement can be visualized via CT scanning at the site where a rabbit was injected subcutaneously with the Bi2S3 DSNPs, and that the mouse pulmonary vein can be visualized via CT after intravenous injection of the Bi2S3 DSNPs. Given their good biocompatibility, enhanced X-ray attenuation, and tunable dendrimer chemistry, the designed Bi2S3 DSNPs should be amenable to further functionalization, allowing them to be used as a highly efficient contrast agent for CT imaging of different biological systems.
Fast generation of computer-generated holograms using wavelet shrinkage.
Shimobaba, Tomoyoshi; Ito, Tomoyoshi
2017-01-09
Computer-generated holograms (CGHs) are generated by superimposing complex amplitudes emitted from a number of object points. However, this superposition process remains very time-consuming even when using the latest computers. We propose a fast calculation algorithm for CGHs that uses a wavelet shrinkage method, eliminating small wavelet coefficient values to express approximated complex amplitudes using only a few representative wavelet coefficients.
Fast solver for large scale eddy current non-destructive evaluation problems
NASA Astrophysics Data System (ADS)
Lei, Naiguang
Eddy current testing plays a very important role in non-destructive evaluation of conducting test samples. Based on Faraday's law, an alternating magnetic field source generates induced currents, called eddy currents, in an electrically conducting test specimen. The eddy currents generate induced magnetic fields that oppose the direction of the inducing magnetic field in accordance with Lenz's law. In the presence of discontinuities in material properties or defects in the test specimen, the induced eddy current paths are perturbed and the associated magnetic fields can be detected by coils or magnetic field sensors, such as Hall elements or magneto-resistance sensors. Because of the complexity of test specimens and inspection environments, theoretical simulation models are extremely valuable for studying the basic field/flaw interactions in order to obtain a fuller understanding of non-destructive testing phenomena. Theoretical models of the forward problem are also useful for training and validating automated defect detection systems, since they generate defect signatures that would be expensive to replicate experimentally. In general, modelling methods can be classified into two categories: analytical and numerical. Although analytical approaches offer closed-form solutions, such solutions are generally unobtainable, largely because of the complex sample and defect geometries involved, especially in three-dimensional space. Numerical modelling has become popular with advances in computer technology and computational methods. However, because large-scale problems are enormously time-consuming, fast solvers are needed to make numerical models practical. This dissertation describes a numerical simulation model for eddy current problems using finite element analysis. The accuracy of this model is validated via comparison with experimental measurements of steam generator tube wall defects.
These simulations, which generate two-dimensional raster-scan data, typically take one to two days on a dedicated eight-core PC. A novel direct integral solver for eddy current problems, together with a GPU-based implementation, is also investigated in this research to reduce the computational time.
Generative Representations for Automated Design of Robots
NASA Technical Reports Server (NTRS)
Hornby, Gregory S.; Lipson, Hod; Pollack, Jordan B.
2007-01-01
A method of automated design of complex, modular robots involves an evolutionary process in which generative representations of designs are used. The term generative representations as used here signifies, loosely, representations that consist of or include algorithms, computer programs, and the like, wherein encoded designs can reuse elements of their encoding and thereby evolve toward greater complexity. Automated design of robots through synthetic evolutionary processes has already been demonstrated, but it is not clear whether genetically inspired search algorithms can yield designs that are sufficiently complex for practical engineering. The ultimate success of such algorithms as tools for automation of design depends on the scaling properties of representations of designs. A nongenerative representation (one in which each element of the encoded design is used at most once in translating to the design) scales linearly with the number of elements. Search algorithms that use nongenerative representations quickly become intractable (search times vary approximately exponentially with numbers of design elements), and thus are not amenable to scaling to complex designs. Generative representations are compact representations and were devised as means to circumvent the above-mentioned fundamental restriction on scalability. In the present method, a robot is defined by a compact programmatic form (its generative representation) and the evolutionary variation takes place on this form. The evolutionary process is an iterative one, wherein each cycle consists of the following steps: 1. Generative representations are generated in an evolutionary subprocess. 2. Each generative representation is a program that, when compiled, produces an assembly procedure. 3. In a computational simulation, a constructor executes an assembly procedure to generate a robot. 4. 
A physical-simulation program tests the performance of a simulated constructed robot, evaluating the performance according to a fitness criterion to yield a figure of merit that is fed back into the evolutionary subprocess of the next iteration. In comparison with prior approaches to automated evolutionary design of robots, the use of generative representations offers two advantages: First, a generative representation enables the reuse of components in regular and hierarchical ways and thereby serves as a systematic means of creating more complex modules out of simpler ones. Second, the evolved generative representation may capture intrinsic properties of the design problem, so that variations in the representations move through the design space more effectively than do equivalent variations in a nongenerative representation. This method has been demonstrated by using it to design some robots that move, variously, by walking, rolling, or sliding. Some of the robots were built (see figure). Although these robots are very simple, in comparison with robots designed by humans, their structures are more regular, modular, hierarchical, and complex than are those of evolved designs of comparable functionality synthesized by use of nongenerative representations.
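The cycle described above hinges on genomes that can reuse parts of their own encoding when "compiled" into a design. A minimal sketch of such a generative, L-system-style representation follows; the symbols and rewrite rules are hypothetical, not those of the actual method.

```python
def expand(genome, rules, depth=0, max_depth=4):
    """Expand a generative representation into a design description.

    Symbols that appear in `rules` rewrite into sequences of symbols,
    so a short encoding reuses its own parts to build a long design.
    genome: list of symbols; rules: dict mapping symbol -> list of symbols.
    """
    if depth >= max_depth:
        return list(genome)
    out = []
    for sym in genome:
        if sym in rules:
            # Reuse: the same sub-encoding is expanded wherever it occurs.
            out.extend(expand(rules[sym], rules, depth + 1, max_depth))
        else:
            out.append(sym)
    return out
```

With rules {"A": ["A", "B"], "B": ["A"]}, the one-symbol genome ["A"] already expands to a five-symbol description after three rewrites; a nongenerative encoding would need one element per output symbol, which is the linear-scaling limitation the abstract describes.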
Applications of Automation Methods for Nonlinear Fracture Test Analysis
NASA Technical Reports Server (NTRS)
Allen, Phillip A.; Wells, Douglas N.
2013-01-01
Using automated and standardized computer tools to calculate the pertinent test result values has several advantages such as: 1. allowing high-fidelity solutions to complex nonlinear phenomena that would be impractical to express in written equation form, 2. eliminating errors associated with the interpretation and programming of analysis procedures from the text of test standards, 3. lessening the need for expertise in the areas of solid mechanics, fracture mechanics, numerical methods, and/or finite element modeling to achieve sound results, and 4. providing one computer tool and/or one set of solutions for all users for a more "standardized" answer. In summary, this approach allows a non-expert with rudimentary training to get the best practical solution based on the latest understanding with minimum difficulty. Other existing ASTM standards that cover complicated phenomena use standard computer programs: 1. ASTM C1340/C1340M-10 - Standard Practice for Estimation of Heat Gain or Loss Through Ceilings Under Attics Containing Radiant Barriers by Use of a Computer Program 2. ASTM F 2815 - Standard Practice for Chemical Permeation through Protective Clothing Materials: Testing Data Analysis by Use of a Computer Program 3. ASTM E2807 - Standard Specification for 3D Imaging Data Exchange, Version 1.0 The verification, validation, and round-robin processes required of a computer tool closely parallel the methods that are used to ensure the solution validity for equations included in test standards. The use of automated analysis tools allows the creation and practical implementation of advanced fracture mechanics test standards that capture the physics of a nonlinear fracture mechanics problem without adding undue burden or expense to the user. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.
Resolving the biophysics of axon transmembrane polarization in a single closed-form description
DOE Office of Scientific and Technical Information (OSTI.GOV)
Melendy, Robert F., E-mail: rfmelendy@liberty.edu
2015-12-28
When a depolarizing event occurs across a cell membrane there is a remarkable change in its electrical properties. A complete depolarization event produces a considerably rapid increase in voltage that propagates longitudinally along the axon and is accompanied by changes in axial conductance. A dynamically changing magnetic field is associated with the passage of the action potential down the axon. Over 75 years of research has gone into the quantification of this phenomenon. To date, no unified model exists that resolves transmembrane polarization in a closed-form description. Here, a simple but formative description of propagated signaling phenomena in the membrane of an axon is presented in closed form. The focus is on using both biophysics and mathematical methods for elucidating the fundamental mechanisms governing transmembrane polarization. The results presented demonstrate how to resolve electromagnetic and thermodynamic factors that govern transmembrane potential. Computational results are supported by well-established quantitative descriptions of propagated signaling phenomena in the membrane of an axon. The findings demonstrate how intracellular conductance, the thermodynamics of magnetization, and current modulation function together in generating an action potential in a unified closed-form description. The work presented in this paper provides compelling evidence that three basic factors contribute to the propagated signaling in the membrane of an axon. It is anticipated this work will compel those in biophysics, physical biology, and the computational neurosciences to probe deeper into the classical and quantum features of membrane magnetization and signaling. It is hoped that subsequent investigations of this sort will be advanced by the computational features of this model without having to resort to numerical methods of analysis.
NASA Technical Reports Server (NTRS)
Ho, P. S.; Ellison, M. J.; Quigley, G. J.; Rich, A.
1986-01-01
The ease with which a particular DNA segment adopts the left-handed Z-conformation depends largely on the sequence and on the degree of negative supercoiling to which it is subjected. We describe a computer program (Z-hunt) that is designed to search long sequences of naturally occurring DNA and retrieve those nucleotide combinations of up to 24 bp in length which show a strong propensity for Z-DNA formation. Incorporated into Z-hunt is a statistical mechanical model based on empirically determined energetic parameters for the B to Z transition accumulated to date. The Z-forming potential of a sequence is assessed by ranking its behavior as a function of negative superhelicity relative to the behavior of similar sized randomly generated nucleotide sequences assembled from over 80,000 combinations. The program makes it possible to compare directly the Z-forming potential of sequences with different base compositions and different sequence lengths. Using Z-hunt, we have analyzed the DNA sequences of the bacteriophage phi X174, plasmid pBR322, the animal virus SV40 and the replicative form of the eukaryotic adenovirus-2. The results are compared with those previously obtained by others from experiments designed to locate Z-DNA forming regions in these sequences using probes which show specificity for the left-handed DNA conformation.
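The window-scanning idea behind a search like this can be sketched very roughly as follows. The dinucleotide energies below are illustrative placeholders, not the empirical B-to-Z parameters Z-hunt actually uses, and the real program additionally models negative superhelicity and ranks each window against a large ensemble of random sequences.

```python
# Hypothetical per-dinucleotide B->Z transition energies (arbitrary units).
# Alternating purine-pyrimidine steps like CG are favored; these numbers
# are illustrative only, not the published empirical parameters.
DINUC_ENERGY = {"CG": 0.7, "CA": 1.3, "TG": 1.3, "AC": 1.3, "GT": 1.3}
DEFAULT_ENERGY = 3.4  # all other dinucleotides

def z_score(window):
    """Total transition energy of a window, summed over non-overlapping
    dinucleotides (Z-DNA's repeating unit); lower = stronger Z propensity."""
    return sum(DINUC_ENERGY.get(window[i:i + 2], DEFAULT_ENERGY)
               for i in range(0, len(window) - 1, 2))

def best_windows(dna, width=12, top=3):
    """Scan `dna` and return the `top` lowest-energy (score, start) windows."""
    scores = [(z_score(dna[i:i + width]), i)
              for i in range(len(dna) - width + 1)]
    return sorted(scores)[:top]
```

On a sequence with an embedded (CG)n tract, the scan flags the alternating-CG window as the strongest Z-forming candidate, mirroring the kind of region the program is designed to retrieve.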
Toward a Virtual Town Square in the Era of Web 2.0
NASA Astrophysics Data System (ADS)
Kavanaugh, Andrea; Perez-Quinones, Manuel A.; Tedesco, John C.; Sanders, William
The use of information and communication technology has been leading to foundational changes in democratic society. In the US, new forms of information distribution, citizen discussion and citizen-to-citizen exchange, including content syndication, tagging, and social software, are changing the ways that citizens access information and participate in democratic discussion with other interested citizens as well as government, especially at the local level. We are interested in how local governments and citizens act as agents of change in the community-wide use of social media (also known as Web 2.0). To what extent, and for whom, does citizen exchange, discussion and collective decision-making supplement offline communication? What is lost in the migration from direct democracy to digital democracy? There are perils as well as opportunities to civic life with the advent of new forms of interaction. Some traditionally politically active participants in the US, such as the older generation, are often uncomfortable with computers. Has their access or participation declined with the migration to electronic forms of government? Conversely, could young adults become more active in civic life through new forms of online social interaction around local or national issues? We report here on changes in civic awareness, political participation, political and collective efficacy, and knowledge sharing among diverse community members based on a decade of research on the social and political use and impact of community-wide computer networking.
Jung, Jin Woo; Lee, Jung-Seob; Cho, Dong-Woo
2016-01-01
Recently, much attention has focused on replacement and/or enhancement of biological tissues via the use of cell-laden hydrogel scaffolds with an architecture that mimics the tissue matrix, and with the desired three-dimensional (3D) external geometry. However, mimicking the heterogeneous tissues that most organs and tissues are formed of is challenging. Although multiple-head 3D printing systems have been proposed for fabricating heterogeneous cell-laden hydrogel scaffolds, to date only the simple exterior form has been realized. Here we describe a computer-aided design and manufacturing (CAD/CAM) system for this application. We aim to develop an algorithm to enable easy, intuitive design and fabrication of a heterogeneous cell-laden hydrogel scaffold with a free-form 3D geometry. The printing paths of the scaffold are automatically generated from the 3D CAD model, and the scaffold is then printed by dispensing four materials: a frame, two kinds of cell-laden hydrogel, and a support. We demonstrated printing of heterogeneous tissue models formed of hydrogel scaffolds using this approach, including the outer ear, kidney and tooth tissue. These results indicate that this approach is particularly promising for tissue engineering and 3D printing applications to regenerate heterogeneous organs and tissues with tailored geometries to treat specific defects or injuries. PMID:26899876
Manufacturing Magic and Computational Creativity
Williams, Howard; McOwan, Peter W.
2016-01-01
This paper describes techniques in computational creativity, blending mathematical modeling and psychological insight, to generate new magic tricks. The details of an explicit computational framework capable of creating new magic tricks are summarized, and evaluated against a range of contemporary theories about what constitutes a creative system. To allow further development of the proposed system we situate this approach to the generation of magic in the wider context of other areas of application in computational creativity in performance arts. We show how approaches in these domains could be incorporated to enhance future magic generation systems, and critically review possible future applications of such magic generating computers. PMID:27375533
A novel method for designing and fabricating low-cost facepiece prototypes.
Joe, Paula S; Shum, Phillip C; Brown, David W; Lungu, Claudiu T
2014-01-01
In 2010, the National Institute for Occupational Safety and Health (NIOSH) published new digital head form models based on their recently updated fit-test panel. The new panel, based on the 2000 census to better represent the modern work force, created two additional sizes: Short/Wide and Long/Narrow. While collecting the anthropometric data that comprised the panel, additional three-dimensional data were collected on a subset of the subjects. Within each sizing category, five individuals' three-dimensional data were used to create the new head form models. While NIOSH has recommended a switch to a five-size system for designing respirators, little has been done in assessing the potential benefits of this change. With commercially available elastomeric facepieces available in only three or four size systems, it was necessary to develop the facepieces to enable testing. This study aims to develop a method for designing and fabricating elastomeric facepieces tailored to the new head form designs for use in fit-testing studies. This novel method used computed tomography of a solid silicone facepiece and a number of computer-aided design programs (VolView, ParaView, MEGG3D, and RapidForm XOR) to develop a facepiece model to accommodate the Short/Wide head form. The generated model was given a physical form by means of three-dimensional printing using stereolithography (SLA). The printed model was then used to create a silicone mold from which elastomeric prototypes can be cast. The prototype facepieces were cast in two types of silicone for use in future fit-testing.
Tangible Landscape: Cognitively Grasping the Flow of Water
NASA Astrophysics Data System (ADS)
Harmon, B. A.; Petrasova, A.; Petras, V.; Mitasova, H.; Meentemeyer, R. K.
2016-06-01
Complex spatial forms like topography can be challenging to understand, much less intentionally shape, given the heavy cognitive load of visualizing and manipulating 3D form. Spatiotemporal processes like the flow of water over a landscape are even more challenging to understand and intentionally direct as they are dependent upon their context and require the simulation of forces like gravity and momentum. This cognitive work can be offloaded onto computers through 3D geospatial modeling, analysis, and simulation. Interacting with computers, however, can also be challenging, often requiring training and highly abstract thinking. Tangible computing - an emerging paradigm of human-computer interaction in which data is physically manifested so that users can feel it and directly manipulate it - aims to offload this added cognitive work onto the body. We have designed Tangible Landscape, a tangible interface powered by an open source geographic information system (GRASS GIS), so that users can naturally shape topography and interact with simulated processes with their hands in order to make observations, generate and test hypotheses, and make inferences about scientific phenomena in a rapid, iterative process. Conceptually Tangible Landscape couples a malleable physical model with a digital model of a landscape through a continuous cycle of 3D scanning, geospatial modeling, and projection. We ran a flow modeling experiment to test whether tangible interfaces like this can effectively enhance spatial performance by offloading cognitive processes onto computers and our bodies. We used hydrological simulations and statistics to quantitatively assess spatial performance. We found that Tangible Landscape enhanced 3D spatial performance and helped users understand water flow.
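The continuous coupling cycle described above can be sketched as a simple loop. The three callables below are hypothetical stand-ins for the 3D scanner, the GRASS GIS flow simulation, and the projector; this is a structural sketch, not the Tangible Landscape codebase.

```python
def tangible_landscape_cycle(scan, simulate, project, steps=3):
    """Sketch of the scan -> model -> project coupling loop.

    scan(): digitize the malleable physical model into an elevation model.
    simulate(elevation): run a geospatial process model (e.g., water flow).
    project(result): overlay the simulation result back onto the model.
    """
    result = None
    for _ in range(steps):
        elevation = scan()        # 3-D scanning of the physical model
        result = simulate(elevation)  # geospatial modeling/simulation
        project(result)           # projection back onto the landscape
    return result
```

Each pass through the loop lets the user reshape the physical topography and immediately see the simulated consequences, which is the rapid, iterative hypothesis-testing process the abstract describes.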
Creating a Computer Adaptive Test Version of the Late-Life Function & Disability Instrument
Jette, Alan M.; Haley, Stephen M.; Ni, Pengsheng; Olarsch, Sippy; Moed, Richard
2009-01-01
Background This study applied Item Response Theory (IRT) and Computer Adaptive Test (CAT) methodologies to develop a prototype function and disability assessment instrument for use in aging research. Herein, we report on the development of the CAT version of the Late-Life Function & Disability instrument (Late-Life FDI) and evaluate its psychometric properties. Methods We employed confirmatory factor analysis, IRT methods, validation, and computer simulation analyses of data collected from 671 older adults residing in residential care facilities. We compared accuracy, precision, and sensitivity to change of scores from CAT versions of two Late-Life FDI scales with scores from the fixed-form instrument. Score estimates from the prototype CAT versus the original instrument were compared in a sample of 40 older adults. Results Distinct function and disability domains were identified within the Late-Life FDI item bank and used to construct two prototype CAT scales. Using retrospective data, scores from computer simulations of the prototype CAT scales were highly correlated with scores from the original instrument. The results of computer simulation, accuracy, precision, and sensitivity to change of the CATs closely approximated those of the fixed-form scales, especially for the 10- or 15-item CAT versions. In the prospective study each CAT was administered in less than 3 minutes and CAT scores were highly correlated with scores generated from the original instrument. Conclusions CAT scores of the Late-Life FDI were highly comparable to those obtained from the full-length instrument with a small loss in accuracy, precision, and sensitivity to change. PMID:19038841
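A CAT of the kind described typically administers, at each step, the unadministered item with maximum Fisher information at the current ability estimate. A sketch under the common 2-parameter-logistic (2PL) IRT model follows; the item parameters are hypothetical, and the Late-Life FDI's actual model and item bank are not reproduced here.

```python
import math

def prob_2pl(theta, a, b):
    """2PL probability of an affirmative response at ability theta,
    for an item with discrimination a and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information of a 2PL item at ability theta: a^2 * p * (1 - p)."""
    p = prob_2pl(theta, a, b)
    return a * a * p * (1.0 - p)

def next_item(theta, items, administered):
    """Pick the unadministered item that is most informative at theta.

    items: list of (a, b) pairs (a hypothetical item bank);
    administered: set of indices already given to the respondent.
    """
    candidates = [i for i in range(len(items)) if i not in administered]
    return max(candidates, key=lambda i: item_information(theta, *items[i]))
```

Because information peaks where item difficulty matches the ability estimate, a 10- or 15-item CAT built this way can approach the precision of the full fixed-form instrument, consistent with the simulation results reported above.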
Exorcising the Ghost in the Machine: Synthetic Spectral Data Cubes for Assessing Big Data Algorithms
NASA Astrophysics Data System (ADS)
Araya, M.; Solar, M.; Mardones, D.; Hochfärber, T.
2015-09-01
The size and quantity of the data being generated by large astronomical projects like ALMA require a paradigm change in astronomical data analysis. Complex data, such as highly sensitive spectroscopic data in the form of large data cubes, are not only difficult to manage, transfer and visualize, but they make traditional data analysis techniques unfeasible. Consequently, attention has been placed on machine learning and artificial intelligence techniques, to develop approximate and adaptive methods for astronomical data analysis within a reasonable computational time. Unfortunately, these techniques are usually suboptimal, stochastic and strongly dependent on their parameters, which could easily turn into “a ghost in the machine” for astronomers and practitioners. Therefore, a proper assessment of these methods is not only desirable but mandatory for trusting them in large-scale usage. The problem is that positively verifiable results are scarce in astronomy, and moreover, science using bleeding-edge instrumentation naturally lacks reference values. We propose an Astronomical SYnthetic Data Observations (ASYDO) tool, a virtual service that generates synthetic spectroscopic data in the form of data cubes. The objective of the tool is not to produce accurate astrophysical simulations, but to generate a large number of labelled synthetic data, to assess advanced computing algorithms for astronomy and to develop novel Big Data algorithms. The synthetic data is generated using a set of spectral lines, template functions for spatial and spectral distributions, and simple models that produce reasonable synthetic observations. Emission lines are obtained automatically using IVOA's SLAP protocol (or from a relational database) and their spectral profiles correspond to distributions in the exponential family. The spatial distributions correspond to simple functions (e.g., 2D Gaussian), or to scalable template objects.
The intensity, broadening and radial velocity of each line is given by very simple and naive physical models, yet ASYDO's generic implementation supports new user-made models, which potentially allows adding more realistic simulations. The resulting data cube is saved as a FITS file, also including all the tables and images used for generating the cube. We expect to implement ASYDO as a virtual observatory service in the near future.
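The cube-assembly idea described above can be sketched in a few lines of numpy (a minimal illustration, not the actual ASYDO code; the function names, defaults, and the (channel, FWHM, intensity) line format here are invented for the example):

```python
import numpy as np

def gaussian_2d(ny, nx, y0, x0, sigma):
    """2D Gaussian spatial template, peak-normalized."""
    y, x = np.mgrid[0:ny, 0:nx]
    return np.exp(-((y - y0) ** 2 + (x - x0) ** 2) / (2.0 * sigma ** 2))

def line_profile(chans, center, fwhm):
    """Gaussian spectral profile (a member of the exponential family)."""
    sigma = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))
    return np.exp(-(chans - center) ** 2 / (2.0 * sigma ** 2))

def make_cube(ny=32, nx=32, nchan=128, lines=((64, 5.0, 1.0),), noise=0.01, seed=0):
    """Assemble a labelled synthetic cube from (channel, fwhm, intensity) lines."""
    rng = np.random.default_rng(seed)
    chans = np.arange(nchan, dtype=float)
    spatial = gaussian_2d(ny, nx, ny / 2, nx / 2, sigma=4.0)
    cube = rng.normal(0.0, noise, size=(nchan, ny, nx))  # thermal-noise floor
    for center, fwhm, intensity in lines:
        cube += intensity * line_profile(chans, center, fwhm)[:, None, None] * spatial
    return cube

cube = make_cube()
print(cube.shape)  # (128, 32, 32)
```

Because the line list is known by construction, every voxel carries a ground-truth label, which is exactly what makes such cubes useful for assessing approximate analysis algorithms.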
Making Galaxies: One Star at a Time
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abel, Tom
2006-09-18
In the age of precision cosmology the fundamental parameters of our world model are being measured to unprecedented accuracy. In particular, measurements of the cosmic microwave background radiation detail the state of the universe only 400,000 years after the big bang. Unfortunately, we have no direct observational evidence about the following few hundred million years, the so-called dark ages. However, we do know from the composition of the highest redshift galaxies that it is there that the earliest and first galaxies formed. From a physics point of view these earliest times are much easier to understand and model because the chemical composition of the early gas is simpler and the first galaxies are much smaller than the ones found nearby. The absence of strong magnetic fields, cosmic rays, dust grains and UV radiation fields clearly also helps. The first generation of structure formation is as such a problem extremely well suited for direct ab initio calculations using supercomputers. In this colloquium I will discuss the rich physics of the formation of the first objects as computed via ab initio Eulerian cosmological adaptive mesh refinement calculations. We find the first generation of stars to be massive and to form in isolation, with masses between 30 and 300 times the mass of the sun. Remarkably, the relevant mass scales can all be understood analytically from the microscopic properties of atomic and molecular hydrogen. The UV radiation from these stars photo-evaporates their parent clouds within their lifetimes, contributing significantly to cosmological reionization. Their supernovae distribute the first heavy elements over thousands of light years and enrich the intergalactic medium. As we are beginning to illuminate these earliest phases of galaxy formation, many new questions arise and become addressable with our novel numerical techniques. How and where are the earliest magnetic fields made?
How do the first super-massive black holes form? When and how can the first planets form in the universe? Algorithmic breakthroughs and large supercomputers enable these studies. Hence I will close by discussing how the expanding computing infrastructure at SLAC and scientific visualization at the Schwob Computing and Information Center at the Fred Kavli building allow us to find answers to the fundamental questions about the beginning of structure in the universe.
van Pelt, Jaap; Carnell, Andrew; de Ridder, Sander; Mansvelder, Huibert D.; van Ooyen, Arjen
2010-01-01
Neurons make synaptic connections at locations where axons and dendrites are sufficiently close in space. Typically the required proximity is based on the dimensions of dendritic spines and axonal boutons. Based on this principle one can search for those locations in networks formed by reconstructed neurons or computer-generated neurons. Candidate synapses are then located where axons and dendrites are within a given criterion distance from each other. Both experimentally reconstructed and model-generated neurons are usually represented morphologically by piecewise-linear structures (line pieces or cylinders). Proximity tests are then performed on all pairs of line pieces from both axonal and dendritic branches. Applying just a test on the distance between line pieces may result in local clusters of synaptic sites when more than one pair of nearby line pieces from axonal and dendritic branches is sufficiently close, and may introduce a dependency on the length scale of the individual line pieces. The present paper describes a new algorithm for defining locations of candidate synapses which is based on the crossing requirement of a line piece pair, while the orthogonal distance between the line pieces is subjected to the distance criterion for testing 3D proximity. PMID:21160548
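The proximity test at the heart of such algorithms can be sketched with the standard clamped closest-point computation between two segments (a generic sketch assuming non-degenerate line pieces; it implements only the distance criterion, not the paper's additional crossing requirement):

```python
import numpy as np

def segment_distance(p1, q1, p2, q2):
    """Minimal distance between the 3D segments p1->q1 and p2->q2.
    Closest-point parameters s, t are clamped to [0, 1] so endpoints count."""
    d1, d2, r = q1 - p1, q2 - p2, p1 - p2
    a, e = d1 @ d1, d2 @ d2            # squared segment lengths
    b, c, f = d1 @ d2, d1 @ r, d2 @ r
    denom = a * e - b * b              # zero when segments are parallel
    s = np.clip((b * f - c * e) / denom, 0.0, 1.0) if denom > 1e-12 else 0.0
    t = (b * s + f) / e
    if t < 0.0:
        t, s = 0.0, np.clip(-c / a, 0.0, 1.0)
    elif t > 1.0:
        t, s = 1.0, np.clip((b - c) / a, 0.0, 1.0)
    return np.linalg.norm((p1 + s * d1) - (p2 + t * d2))

# A candidate synapse is declared where the distance is within the criterion:
axon = (np.array([0.0, 0.0, 0.0]), np.array([2.0, 0.0, 0.0]))
dend = (np.array([1.0, -1.0, 1.0]), np.array([1.0, 1.0, 1.0]))
print(segment_distance(*axon, *dend))  # 1.0
```

Running this test over all axonal/dendritic line-piece pairs and thresholding at the criterion distance reproduces the clustering problem the paper describes: several adjacent pairs along the same crossing can all pass the threshold.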
Phenomenological modeling of nonlinear holograms based on metallic geometric metasurfaces.
Ye, Weimin; Li, Xin; Liu, Juan; Zhang, Shuang
2016-10-31
Benefiting from efficient local phase and amplitude control at the subwavelength scale, metasurfaces offer a new platform for computer-generated holography with high spatial resolution. Three-dimensional and highly efficient holograms have been realized by metasurfaces constituted by subwavelength meta-atoms with spatially varying geometries or orientations. Metasurfaces have been recently extended to the nonlinear optical regime to generate holographic images in harmonic generation waves. Thus far, there has been no vector field simulation of nonlinear metasurface holograms because of the tremendous computational challenge in numerically calculating the collective nonlinear responses of the large number of different subwavelength meta-atoms in a hologram. Here, we propose a general phenomenological method to model nonlinear metasurface holograms based on the assumption that every meta-atom can be described by a localized nonlinear polarizability tensor. Applied to geometric nonlinear metasurfaces, we numerically model the holographic images formed by the second-harmonic waves of different spins. We show that, in contrast to the metasurface holograms operating in the linear optical regime, the wavelength of incident fundamental light should be slightly detuned from the fundamental resonant wavelength to optimize the efficiency and quality of nonlinear holographic images. The proposed modeling provides a general method to simulate nonlinear optical devices based on metallic metasurfaces.
a Recursive Approach to Compute Normal Forms
NASA Astrophysics Data System (ADS)
HSU, L.; MIN, L. J.; FAVRETTO, L.
2001-06-01
Normal forms are instrumental in the analysis of dynamical systems described by ordinary differential equations, particularly when singularities close to a bifurcation are to be characterized. However, the computation of a normal form up to an arbitrary order is numerically hard. This paper focuses on the computer programming of some recursive formulas developed earlier to compute higher order normal forms. A computer program to reduce the system to its normal form on a center manifold is developed using the Maple symbolic language. However, it should be stressed that the program relies essentially on recursive numerical computations, while symbolic calculations are used only for minor tasks. Some strategies are proposed to save computation time. Examples are presented to illustrate the application of the program to obtain high order normalization or to handle systems with large dimension.
Social network extraction based on Web: 3. the integrated superficial method
NASA Astrophysics Data System (ADS)
Nasution, M. K. M.; Sitompul, O. S.; Noah, S. A.
2018-03-01
The Web as a source of information has become part of social behavior information. Even though it involves only the limited information disclosed by search engines in the form of hit counts, snippets, and URL addresses of web pages, the integrated extraction method produces a social network that is not only trusted but enriched. Unintegrated extraction methods may produce social networks without explanation, resulting in poor supplemental information, or may produce a surmise-laden social network and consequently unrepresentative social structures. The integrated superficial method, in addition to generating the core social network, also generates an expanded network so as to reach the scope of relation clues, with a number of edges computationally close to n(n - 1)/2 for n social actors.
Global magnetosphere simulations using constrained-transport Hall-MHD with CWENO reconstruction
NASA Astrophysics Data System (ADS)
Lin, L.; Germaschewski, K.; Maynard, K. M.; Abbott, S.; Bhattacharjee, A.; Raeder, J.
2013-12-01
We present a new CWENO (Centrally-Weighted Essentially Non-Oscillatory) reconstruction based MHD solver for the OpenGGCM global magnetosphere code. The solver was built using libMRC, a library for creating efficient parallel PDE solvers on structured grids. The use of libMRC gives us access to its core functionality of providing an automated code generation framework which takes a user provided PDE right hand side in symbolic form to generate an efficient, computer architecture specific, parallel code. libMRC also supports block-structured adaptive mesh refinement and implicit-time stepping through integration with the PETSc library. We validate the new CWENO Hall-MHD solver against existing solvers both in standard test problems as well as in global magnetosphere simulations.
Collisional-radiative simulations of a supersonic and radiatively cooled aluminum plasma jet
NASA Astrophysics Data System (ADS)
Espinosa, G.; Gil, J. M.; Rodriguez, R.; Rubiano, J. G.; Mendoza, M. A.; Martel, P.; Minguez, E.; Suzuki-Vidal, F.; Lebedev, S. V.; Swadling, G. F.; Burdiak, G.; Pickworth, L. A.; Skidmore, J.
2015-12-01
A computational investigation based on collisional-radiative simulations of a supersonic and radiatively cooled aluminum plasma jet is presented. The jet, both in vacuum and in argon ambient gas, was produced on the MAGPIE (Mega Ampere Generator for Plasma Implosion Experiments) generator and is formed by ablation of an aluminum foil driven by a 1.4 MA, 250 ns current pulse in a radial foil Z-pinch configuration. In this work, population kinetics and radiative properties simulations of the jet in different theoretical approximations were performed. In particular, local thermodynamic equilibrium (LTE), non-LTE steady state (SS) and non-LTE time dependent (TD) models have been considered. This study allows us to make a convenient microscopic characterization of the aluminum plasma jet.
Nonlinear metamaterials for holography
Almeida, Euclides; Bitton, Ora
2016-01-01
A hologram is an optical element storing phase and possibly amplitude information enabling the reconstruction of a three-dimensional image of an object by illumination and scattering of a coherent beam of light, and the image is generated at the same wavelength as the input laser beam. In recent years, it was shown that information can be stored in nanometric antennas giving rise to ultrathin components. Here we demonstrate nonlinear multilayer metamaterial holograms. A background free image is formed at a new frequency—the third harmonic of the illuminating beam. Using e-beam lithography of multilayer plasmonic nanoantennas, we fabricate polarization-sensitive nonlinear elements such as blazed gratings, lenses and other computer-generated holograms. These holograms are analysed and prospects for future device applications are discussed. PMID:27545581
Differential theory of learning for efficient neural network pattern recognition
NASA Astrophysics Data System (ADS)
Hampshire, John B., II; Vijaya Kumar, Bhagavatula
1993-08-01
We describe a new theory of differential learning by which a broad family of pattern classifiers (including many well-known neural network paradigms) can learn stochastic concepts efficiently. We describe the relationship between a classifier's ability to generalize well to unseen test examples and the efficiency of the strategy by which it learns. We list a series of proofs that differential learning is efficient in its information and computational resource requirements, whereas traditional probabilistic learning strategies are not. The proofs are illustrated by a simple example that lends itself to closed-form analysis. We conclude with an optical character recognition task for which three different types of differentially generated classifiers generalize significantly better than their probabilistically generated counterparts.
Computer-Generated, Three-Dimensional Character Animation: A Report and Analysis.
ERIC Educational Resources Information Center
Kingsbury, Douglas Lee
This master's thesis details the experience gathered in the production "Snoot and Muttly," a short character animation with 3-D computer generated images, and provides an analysis of the computer-generated 3-D character animation system capabilities. Descriptions are provided of the animation environment at the Ohio State University…
Turbofan noise generation. Volume 2: Computer programs
NASA Technical Reports Server (NTRS)
Ventres, C. S.; Theobald, M. A.; Mark, W. D.
1982-01-01
The use of a package of computer programs developed to calculate the in-duct acoustic modes excited by a fan/stator stage operating at subsonic tip speed is described. The following three noise source mechanisms are included: (1) sound generated by the rotor blades interacting with turbulence ingested into, or generated within, the inlet duct; (2) sound generated by the stator vanes interacting with the turbulent wakes of the rotor blades; and (3) sound generated by the stator vanes interacting with the velocity deficits in the mean wakes of the rotor blades. The computations for the three different noise mechanisms are coded as three separate computer program packages. The computer codes are described by means of block diagrams, tables of data and variables, and example program executions; FORTRAN listings are included.
Turbofan noise generation. Volume 2: Computer programs
NASA Astrophysics Data System (ADS)
Ventres, C. S.; Theobald, M. A.; Mark, W. D.
1982-07-01
The use of a package of computer programs developed to calculate the in-duct acoustic modes excited by a fan/stator stage operating at subsonic tip speed is described. The following three noise source mechanisms are included: (1) sound generated by the rotor blades interacting with turbulence ingested into, or generated within, the inlet duct; (2) sound generated by the stator vanes interacting with the turbulent wakes of the rotor blades; and (3) sound generated by the stator vanes interacting with the velocity deficits in the mean wakes of the rotor blades. The computations for the three different noise mechanisms are coded as three separate computer program packages. The computer codes are described by means of block diagrams, tables of data and variables, and example program executions; FORTRAN listings are included.
On the generation and evolution of internal gravity waves
NASA Technical Reports Server (NTRS)
Lansing, F. S.; Maxworthy, T.
1984-01-01
The tidal generation and evolution of internal gravity waves is investigated experimentally and theoretically using a two-dimensional two-layer model. Time-dependent flow is created by moving a profile of maximum submerged depth 7.7 cm through a total stroke of 29 cm in water above a freon-kerosene mixture in an 8.6-m-long 30-cm-deep 20-cm-wide transparent channel, and the deformation of the fluid interface is recorded photographically. A theoretical model of the interface as a set of discrete vortices is constructed numerically; the rigid structures are represented by a source distribution; governing equations in Lagrangian form are obtained; and two integrodifferential equations relating baroclinic vorticity generation and source-density generation are derived. The experimental and computed results are shown in photographs and graphs, respectively, and found to be in good agreement at small Froude numbers. The reasons for small discrepancies in the position of the maximum interface displacement at large Froude numbers are examined.
Lemeshko, Victor V
2016-07-01
Mitochondrial energy in cardiac cells has been reported to be channeled into the cytosol through the intermembrane contact sites formed by the adenine nucleotide translocator, creatine kinase and VDAC. Computational analysis performed in this study showed a high probability of the outer membrane potential (OMP) generation coupled to such a mechanism of energy channeling in respiring mitochondria. OMPs, positive inside, calculated at elevated concentrations of creatine are high enough to restrict ATP release from mitochondria, to significantly decrease the apparent K(m,ADP) for state 3 respiration and to maintain low concentrations of Ca(2+) in the mitochondrial intermembrane space. An inhibition by creatine of Ca(2+)-induced swelling of isolated mitochondria and other protective effects of creatine reported in the literature might be explained by generated positive OMP. We suggest that VDAC-creatine kinase-dependent generation of OMP represents a novel physiological factor controlling metabolic state of mitochondria, cell energy channeling and resistance to death. Copyright © 2016 Elsevier B.V. All rights reserved.
Integrated circuit test-port architecture and method and apparatus of test-port generation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Teifel, John
A method and apparatus are provided for generating RTL code for a test-port interface of an integrated circuit. In an embodiment, a test-port table is provided as input data. A computer automatically parses the test-port table into data structures and analyzes it to determine input, output, local, and output-enable port names. The computer generates address-detect and test-enable logic constructed from combinational functions. The computer generates one-hot multiplexer logic for at least some of the output ports. The one-hot multiplexer logic for each port is generated so as to enable the port to toggle between data signals and test signals. The computer then completes the generation of the RTL code.
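A one-hot multiplexer of the kind described, toggling a port between data and test signals, can be emitted as Verilog from a short generator function (a hypothetical sketch; the signal names, widths, and table format are invented, and the actual tool derives them from the parsed test-port table):

```python
def one_hot_mux_rtl(port, width, inputs):
    """Emit a Verilog continuous assignment implementing a one-hot mux.

    'inputs' is a list of (select, signal) pairs whose select wires are
    assumed to be one-hot; each signal is replicated-ANDed with its select
    and the terms are ORed together."""
    terms = " | ".join(f"({{{width}{{{sel}}}}} & {sig})" for sel, sig in inputs)
    return f"assign {port} = {terms};"

# Toggle an (invented) output port between functional data and a test pattern:
print(one_hot_mux_rtl("gpio_out", 8,
                      [("~test_en", "gpio_data"), ("test_en", "test_pattern")]))
# assign gpio_out = ({8{~test_en}} & gpio_data) | ({8{test_en}} & test_pattern);
```

The AND-OR form avoids priority logic: because the selects are one-hot, exactly one term is active and the synthesized mux has a single gate level per input.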
Hemodynamics of the Aortic Jet and Implications for Detection of Aortic Stenosis Murmurs
NASA Astrophysics Data System (ADS)
Zhu, Chi; Seo, Junghee; Bakhshaee, Hani; Mittal, Rajat
2016-11-01
Cardiac auscultation with a stethoscope has served as the primary method for qualitative screening of cardiovascular conditions for over a hundred years. However, a lack of quantitative understanding of the flow mechanism(s) responsible for the generation of the murmurs, as well as the effect of intervening tissue on the propagation of these murmurs has been a significant limiting factor in the advancement of automated cardiac auscultation. In this study, a multiphysics computational modeling approach is used to investigate these issues. Direct numerical simulation (DNS) is used to explore the fluid dynamics of the jets formed at the aortic valve and the pressure fluctuations generated by the interaction of this jet with the aortic wall. Subsequently, structural wave propagation in the tissue is resolved by a high-order, linear viscoelastic wave solver in order to explore the propagation of the murmurs through a tissue-like material. The implications of these results for cardiac auscultation are discussed. The authors would like to acknowledge the financial support from NSF Grants IIS-1344772, CBET-1511200, and computational resource by XSEDE NSF Grant TG-CTS100002.
A High Frequency Model of Cascade Noise
NASA Technical Reports Server (NTRS)
Envia, Edmane
1998-01-01
Closed form asymptotic expressions for computing high frequency noise generated by an annular cascade in an infinite duct containing a uniform flow are presented. There are two new elements in this work. First, the annular duct mode representation does not rely on the often-used Bessel function expansion resulting in simpler expressions for both the radial eigenvalues and eigenfunctions of the duct. In particular, the new representation provides an explicit approximate formula for the radial eigenvalues obviating the need for solutions of the transcendental annular duct eigenvalue equation. Also, the radial eigenfunctions are represented in terms of exponentials eliminating the numerical problems associated with generating the Bessel functions on a computer. The second new element is the construction of an unsteady response model for an annular cascade. The new construction satisfies the boundary conditions on both the cascade and duct walls simultaneously adding a new level of realism to the noise calculations. Preliminary results which demonstrate the effectiveness of the new elements are presented. A discussion of the utility of the asymptotic formulas for calculating cascade discrete tone as well as broadband noise is also included.
Elastic properties of dense solid phases of hard cyclic pentamers and heptamers in two dimensions.
Wojciechowski, K W; Tretiakov, K V; Kowalik, M
2003-03-01
Systems of model planar, nonconvex, hard-body "molecules" with fivefold and sevenfold symmetry axes are studied by constant pressure Monte Carlo simulations with variable shape of the periodic box. The molecules, referred to as pentamers (heptamers), are composed of five (seven) identical hard-disk "atoms" with centers forming regular pentagons (heptagons) with sides equal to the disk diameter. The elastic compliances of defect-free solid phases are computed by analysis of strain fluctuations, and the reference (equilibrium) state is determined within the same run in which the elastic properties are computed. Results obtained by using pseudorandom number generators based on the idea proposed by Holian and co-workers [Holian et al., Phys. Rev. E 50, 1607 (1994)] are in good agreement with the results generated by DRAND48. It is shown that the singular behavior of the elastic constants near close packing is in agreement with the free volume approximation; the coefficients of the leading singularities are estimated. The simulations prove that the highest density structures of heptamers (in which the molecules cannot rotate) are auxetic, i.e., show negative Poisson ratios.
A Computational Model for Predicting Gas Breakdown
NASA Astrophysics Data System (ADS)
Gill, Zachary
2017-10-01
Pulsed-inductive discharges are a common method of producing a plasma. They provide a mechanism for quickly and efficiently generating a large volume of plasma for rapid use and are seen in applications including propulsion, fusion power, and high-power lasers. However, some common designs see a delayed response time due to the plasma forming when the magnitude of the magnetic field in the thruster is at a minimum. New designs are difficult to evaluate due to the amount of time needed to construct a new geometry and the high monetary cost of changing the power generation circuit. To more quickly evaluate new designs and better understand the shortcomings of existing designs, a computational model is developed. This model uses a modified single-electron model as the basis for a Mathematica code to determine how the energy distribution in a system changes with respect to time and location. By analyzing this energy distribution, the approximate time and location of initial plasma breakdown can be predicted. The results from this code are then compared to existing data to show its validity and shortcomings. Missouri S&T APLab.
Interactions with Virtual People: Do Avatars Dream of Digital Sheep?. Chapter 6
NASA Technical Reports Server (NTRS)
Slater, Mel; Sanchez-Vives, Maria V.
2007-01-01
This paper explores another form of artificial entity, one without physical embodiment. We use the name virtual characters for a type of interactive object that has become familiar in computer games and within virtual reality applications. We refer to these as avatars: three-dimensional graphical objects in more-or-less human form which can interact with humans. Sometimes such avatars will be representations of real humans who are interacting together within a shared networked virtual environment; other times the representations will be of entirely computer-generated characters. Unlike other authors, who reserve the term agent for entirely computer-generated characters and avatar for virtual embodiments of real people, the same term here is used for both. This is because avatars and agents are on a continuum. The question is: where does their behaviour originate? At the extremes the behaviour is either completely computer generated or comes only from tracking of a real person. However, not every aspect of a real person can be tracked: every eyebrow move, every blink, every breath. Rather, real tracking data would be supplemented by inferred behaviours which are programmed based on the available information as to what the real human is doing and her/his underlying emotional and psychological state. Hence there is always some programmed behaviour; it is only a matter of how much. In any case the same underlying problem remains: how can the human character be portrayed in such a manner that its actions are believable and have an impact on the real people with whom it interacts? This paper has three main parts. In the first part we will review some evidence that suggests that humans react with appropriate affect in their interactions with virtual human characters, or with other humans who are represented as avatars. This is so in spite of the fact that the representational fidelity is relatively low.
Our evidence will be from the realm of psychotherapy, where virtual social situations are created that do test whether people react appropriately within these situations. We will also consider some experiments on face-to-face virtual communications between people in the same shared virtual environments. The second part will try to give some clues about why this might happen, taking into account modern theories of perception from neuroscience. The third part will include some speculations about the future developments of the relationship between people and virtual people. We will suggest that a more likely scenario than the world becoming populated by physically embodied virtual people (robots, androids) is that in the relatively near future we will interact more and more in our everyday lives with virtual people: bank managers, shop assistants, instructors, and so on. What is happening in the movies with computer-graphic-generated individuals and entire crowds may move into the space of everyday life.
A Protein Chimera Strategy Supports Production of a Model "Difficult-to-Express" Recombinant Target.
Hussain, Hirra; Fisher, David I; Roth, Robert G; Abbott, W Mark; Carballo-Amador, Manuel Alejandro; Warwicker, Jim; Dickson, Alan J
2018-06-22
Due in part to the needs of the biopharmaceutical industry, there has been an increased drive to generate high-quality recombinant proteins in large amounts. However, achieving high yields can be a challenge, as the novelty and increased complexity of new targets often make them 'difficult-to-express'. This study aimed to define the molecular features that restrict the production of a model 'difficult-to-express' recombinant protein, Tissue Inhibitor of Metalloproteinases-3 (TIMP-3). Building from experimental data, computational approaches were used to rationalise the re-design of this recombinant target to generate a chimera with enhanced secretion. The results highlight the importance of early identification of unfavourable sequence attributes, enabling the generation of engineered protein forms that bypass 'secretory' bottlenecks and result in efficient recombinant protein production. This article is protected by copyright. All rights reserved.
Computer image generation: Reconfigurability as a strategy in high fidelity space applications
NASA Technical Reports Server (NTRS)
Bartholomew, Michael J.
1989-01-01
The demand for realistic, high fidelity, computer image generation systems to support space simulation is well established. However, as the number and diversity of space applications increase, the complexity and cost of computer image generation systems also increase. One strategy used to harmonize cost with varied requirements is the establishment of a reconfigurable image generation system that can be adapted rapidly and easily to meet new and changing requirements. The reconfigurability strategy through the life cycle of system conception, specification, design, implementation, operation, and support for high fidelity computer image generation systems is discussed. The discussion is limited to those issues directly associated with reconfigurability and adaptability of a specialized scene generation system in a multi-faceted space applications environment. Examples and insights gained through the recent development and installation of the Improved Multi-function Scene Generation System at the Johnson Space Center Systems Engineering Simulator are reviewed and compared with current simulator industry practices. The results are clear: the strategy of reconfigurability applied to space simulation requirements provides a viable path to supporting diverse applications with an adaptable computer image generation system.
Computer Series, 13: Bits and Pieces, 11.
ERIC Educational Resources Information Center
Moore, John W., Ed.
1982-01-01
Describes computer programs (with ordering information) on various topics including, among others, modeling of thermodynamics and economics of solar energy, radioactive decay simulation, stoichiometry drill/tutorial (in Spanish), computer-generated safety quiz, medical chemistry computer game, medical biochemistry question bank, generation of…
Hobart, J; Thompson, A
2001-01-01
OBJECTIVES—Routine data collection is now considered mandatory. Therefore, staff-rated clinical scales that consist of multiple items should have the minimum number of items necessary for rigorous measurement. This study explores the possibility of developing a short form Barthel index, suitable for use in clinical trials, epidemiological studies, and audit, that satisfies criteria for rigorous measurement and is psychometrically equivalent to the 10 item instrument. METHODS—Data were analysed from 844 consecutive admissions to a neurological rehabilitation unit in London. Random half samples were generated. Short forms were developed in one sample (n=419) by selecting items with the best measurement properties, and tested in the other (n=418). For each of the 10 items of the BI, item total correlations and effect sizes were computed and rank ordered. The best items were defined as those with the lowest cross product of these rank orderings. The acceptability, reliability, validity, and responsiveness of three short form BIs (five, four, and three item) were determined and compared with the 10 item BI. Agreement between scores generated by the short forms and the 10 item BI was determined using intraclass correlation coefficients and the method of Bland and Altman. RESULTS—The five best items in this sample were transfers, bathing, toilet use, stairs, and mobility. Of the three short forms examined, the five item BI had the best measurement properties and was psychometrically equivalent to the 10 item BI. Agreement between scores generated by the two measures for individual patients was excellent (ICC=0.90) but not identical (limits of agreement=1.84±3.84). CONCLUSIONS—The five item short form BI may be a suitable outcome measure for group comparison studies in comparable samples. Further evaluations are needed.
Results demonstrate a fundamental difference between assessment and measurement and the importance of incorporating psychometric methods in the development and evaluation of health measures. PMID:11459898
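The two agreement statistics used above can be sketched directly (a generic illustration on synthetic paired scores, not the study's data; the abstract does not state which ICC variant was used, so the two-way random, absolute-agreement form ICC(2,1) is assumed here):

```python
import numpy as np

def bland_altman(a, b):
    """Bland-Altman limits of agreement: mean difference +/- 1.96 SD."""
    diff = np.asarray(a, float) - np.asarray(b, float)
    md, sd = diff.mean(), diff.std(ddof=1)
    return md, (md - 1.96 * sd, md + 1.96 * sd)

def icc_2_1(a, b):
    """ICC(2,1): two-way random effects, absolute agreement, single measure."""
    x = np.column_stack([a, b]).astype(float)
    n, k = x.shape
    grand = x.mean()
    ms_r = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # subjects
    ms_c = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)   # raters/forms
    sse = ((x - x.mean(axis=1, keepdims=True)
              - x.mean(axis=0, keepdims=True) + grand) ** 2).sum()
    ms_e = sse / ((n - 1) * (k - 1))
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)
```

The two views are complementary: the ICC summarizes relative agreement across the sample, while the Bland-Altman limits expose systematic bias and its spread for individual patients, which is why the abstract can report an excellent ICC alongside non-identical limits of agreement.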
Design preferences and cognitive styles: experimentation by automated website synthesis.
Leung, Siu-Wai; Lee, John; Johnson, Chris; Robertson, David
2012-06-29
This article aims to demonstrate computational synthesis of Web-based experiments in undertaking experimentation on relationships among the participants' design preference, rationale, and cognitive test performance. The exemplified experiments were computationally synthesised, including the websites as materials, experiment protocols as methods, and cognitive tests as protocol modules. This work also exemplifies the use of a website synthesiser as an essential instrument enabling the participants to explore different possible designs, which were generated on the fly, before selection of preferred designs. The participants were given interactive tree and table generators so that they could explore some different ways of presenting causality information in tables and trees as the visualisation formats. The participants gave their preference ratings for the available designs, as well as their rationale (criteria) for their design decisions. The participants were also asked to take four cognitive tests, which focus on the aspects of visualisation and analogy-making. The relationships among preference ratings, rationale, and the results of cognitive tests were analysed by conservative non-parametric statistics including the Wilcoxon test, the Kruskal-Wallis test, and Kendall correlation. In the experiment, 41 of the 64 participants preferred graphical (tree-form) to tabular presentation. Despite the popular preference for graphical presentation, the given tabular presentation was generally rated to be easier than graphical presentation to interpret, especially by those who scored lower in the visualisation and analogy-making tests. This piece of evidence helps generate a hypothesis that design preferences are related to specific cognitive abilities. Without the use of computational synthesis, the experiment setup and scientific results would be impractical to obtain.
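Of the nonparametric statistics listed above, the Kendall correlation is simple enough to compute directly from concordant and discordant pairs (a minimal tau-a sketch without tie correction, unlike the tau-b that most statistics packages report):

```python
from itertools import combinations

def kendall_tau(x, y):
    """Kendall tau-a: (concordant - discordant) over all n(n-1)/2 pairs."""
    n = len(x)
    s = 0
    for i, j in combinations(range(n), 2):
        prod = (x[i] - x[j]) * (y[i] - y[j])
        s += (prod > 0) - (prod < 0)   # +1 concordant, -1 discordant, 0 tied
    return s / (n * (n - 1) / 2)

print(kendall_tau([1, 2, 3, 4], [1, 2, 3, 4]))   # 1.0
print(kendall_tau([1, 2, 3, 4], [4, 3, 2, 1]))   # -1.0
```

Because it depends only on rank order, the statistic is well suited to preference ratings and test scores whose scales are ordinal rather than interval, which matches the article's conservative choice of tests.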
Kerl, Paul Y; Zhang, Wenxian; Moreno-Cruz, Juan B; Nenes, Athanasios; Realff, Matthew J; Russell, Armistead G; Sokol, Joel; Thomas, Valerie M
2015-09-01
Integrating accurate air quality modeling with decision making is hampered by complex atmospheric physics and chemistry and its coupling with atmospheric transport. Existing approaches to model the physics and chemistry accurately lead to significant computational burdens in computing the response of atmospheric concentrations to changes in emissions profiles. By integrating a reduced form of a fully coupled atmospheric model within a unit commitment optimization model, we allow, for the first time to our knowledge, a fully dynamical approach toward electricity planning that accurately and rapidly minimizes both cost and health impacts. The reduced-form model captures the response of spatially resolved air pollutant concentrations to changes in electricity-generating plant emissions on an hourly basis with accuracy comparable to a comprehensive air quality model. The integrated model allows for the inclusion of human health impacts into cost-based decisions for power plant operation. We use the new capability in a case study of the state of Georgia over the years 2004-2011, and show that a shift in utilization among existing power plants during selected hourly periods could have provided a health cost savings of $175.9 million for an additional electricity generation cost of $83.6 million in 2007 US dollars (USD2007). The case study illustrates how air pollutant health impacts can be cost-effectively minimized by intelligently modulating power plant operations over multihour periods, without implementing additional emissions control technologies.
Kerl, Paul Y.; Zhang, Wenxian; Moreno-Cruz, Juan B.; Nenes, Athanasios; Realff, Matthew J.; Russell, Armistead G.; Sokol, Joel; Thomas, Valerie M.
2015-01-01
Integrating accurate air quality modeling with decision making is hampered by complex atmospheric physics and chemistry and its coupling with atmospheric transport. Existing approaches to model the physics and chemistry accurately lead to significant computational burdens in computing the response of atmospheric concentrations to changes in emissions profiles. By integrating a reduced form of a fully coupled atmospheric model within a unit commitment optimization model, we allow, for the first time to our knowledge, a fully dynamical approach toward electricity planning that accurately and rapidly minimizes both cost and health impacts. The reduced-form model captures the response of spatially resolved air pollutant concentrations to changes in electricity-generating plant emissions on an hourly basis with accuracy comparable to a comprehensive air quality model. The integrated model allows for the inclusion of human health impacts into cost-based decisions for power plant operation. We use the new capability in a case study of the state of Georgia over the years 2004–2011, and show that a shift in utilization among existing power plants during selected hourly periods could have provided a health cost savings of $175.9 million for an additional electricity generation cost of $83.6 million in 2007 US dollars (USD2007). The case study illustrates how air pollutant health impacts can be cost-effectively minimized by intelligently modulating power plant operations over multihour periods, without implementing additional emissions control technologies. PMID:26283358
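The integration of health impacts into dispatch decisions can be caricatured as adding a damage term to the dispatch objective. The toy linear program below uses invented plant costs, health damages, and capacities; the paper's reduced-form air quality model is far richer than a fixed $/MWh damage coefficient:

```python
# Toy dispatch: minimize generation cost alone, then generation plus
# health cost. All plant data are hypothetical.
import numpy as np
from scipy.optimize import linprog

gen_cost    = np.array([20.0, 35.0, 50.0])   # $/MWh (hypothetical)
health_cost = np.array([40.0,  5.0,  1.0])   # $/MWh damages (hypothetical)
capacity    = np.array([100.0, 80.0, 60.0])  # MW
demand = 150.0                               # MW to serve this hour

def dispatch(weights):
    """Least-cost dispatch meeting demand within plant capacities."""
    res = linprog(weights,
                  A_eq=[np.ones(3)], b_eq=[demand],
                  bounds=list(zip([0.0] * 3, capacity)))
    return res.x

cost_only   = dispatch(gen_cost)                # dirty-but-cheap plant maxed out
with_health = dispatch(gen_cost + health_cost)  # its output shifts to cleaner plants
print(cost_only, with_health)
```

The shift in utilization between the two solutions mirrors the paper's finding that re-dispatching existing plants, with no new control technology, can buy health savings for a modest generation-cost increase.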
Two-year evaluation indicates zirconia bridges acceptable alternative to PFMs.
Perry, Ronald D; Kugel, Gerard; Sharma, Shradha; Ferreira, Susana; Magnuson, Britta
2012-01-01
The aim of this in-vivo study was to evaluate the 2-year clinical performance of zirconia computer-aided design/computer-aided manufacturing (CAD/CAM)-generated bridges. A total of 16 three- or four-unit Lava zirconia bridges were placed in 15 subjects. The bridges were cemented using RelyX™ Unicem Self-Adhesive Universal Resin Cement. Evaluation was done at 6-month, 1-year, and 2-year recall visits. Evaluation criteria were color stability and matching, marginal integrity, marginal discoloration, incidence of caries, changes in restoration-tooth interface, changes in surface texture, postoperative sensitivity, maintenance of periodontal health, changes in proximal and opposing teeth, and maintenance of anatomic form. In each of these parameters, the bridges were rated in one of three possible categories: "A" (alpha)--ideal; "B" (bravo)--acceptable; and "C" (charlie)--unacceptable. After 2 years, 100% of the bridges were rated "A" for color stability and matching, marginal discoloration, incidence of caries, changes in restoration-tooth interface, changes in surface texture, postoperative sensitivity, and change in proximal or opposing teeth. In the parameter of marginal integrity, 6.25% of the bridges were rated "B"; the remaining 93.75% were rated "A." Maintenance of periodontal health was rated "B" for 6.25% of the bridges and "A" for 93.75%. At 2 years, 12.5% of the bridges rated "C" in maintenance of anatomic form and 87.5% rated "A." The overall clinical outcome was that the CAD/CAM-generated zirconia bridges were clinically acceptable.
Apparatus and process for freeform fabrication of composite reinforcement preforms
NASA Technical Reports Server (NTRS)
Yang, Junsheng (Inventor); Wu, Liangwei (Inventor); Liu, Junhai (Inventor); Jang, Bor Z. (Inventor)
2001-01-01
A solid freeform fabrication process and apparatus for making a three-dimensional reinforcement shape. The process comprises the steps of (1) operating a multiple-channel material deposition device for dispensing a liquid adhesive composition and selected reinforcement materials at predetermined proportions onto a work surface; (2) during the material deposition process, moving the deposition device and the work surface relative to each other in an X-Y plane defined by first and second directions and in a Z direction orthogonal to the X-Y plane so that the materials are deposited to form a first layer of the shape; (3) repeating these steps to deposit multiple layers for forming a three-dimensional preform shape; and (4) periodically hardening the adhesive to rigidize individual layers of the preform. These steps are preferably executed under the control of a computer system by taking additional steps of (5) creating a geometry of the shape on the computer with the geometry including a plurality of segments defining the preform shape and each segment being preferably coded with a reinforcement composition defining a specific proportion of different reinforcement materials; (6) generating programmed signals corresponding to each of the segments in a predetermined sequence; and (7) moving the deposition device and the work surface relative to each other in response to these programmed signals. Preferably, the system is also operated to generate a support structure for any un-supported feature of the 3-D preform shape.
DOE Office of Scientific and Technical Information (OSTI.GOV)
March-Leuba, S.; Jansen, J.F.; Kress, R.L.
1992-08-01
A new program package, Symbolic Manipulator Laboratory (SML), for the automatic generation of both kinematic and static manipulator models in symbolic form is presented. Critical design parameters may be identified and optimized using symbolic models, as shown in the sample application presented for the Future Armor Rearm System (FARS) arm. The computer-aided development of the symbolic models yields equations with reduced numerical complexity. Important consideration has been given to the simplification of closed-form solutions and to user-friendly operation. The main emphasis of this research is the development of a methodology, implemented in a computer program, capable of generating symbolic kinematic and static force models of manipulators. The fact that the models are obtained in trigonometrically reduced form is among the most significant results of this work and the most difficult to implement. Mathematica, a commercial program that allows symbolic manipulation, is used to implement the program package. SML is written such that the user can change any of the subroutines or create new ones easily. To assist the user, on-line help has been written to make SML a user-friendly package. Some sample applications are presented. The design and optimization of the 5-degrees-of-freedom (DOF) FARS manipulator using SML is discussed. Finally, the kinematic and static models of two different 7-DOF manipulators are calculated symbolically.
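SML itself is a Mathematica package. As a rough analogue only, the SymPy sketch below derives a symbolic, trigonometrically simplified kinematic model (the position and Jacobian of a hypothetical 2-DOF planar arm, not the FARS manipulator):

```python
# SymPy analogue of symbolic kinematic model generation: forward
# kinematics and Jacobian of a hypothetical 2-DOF planar arm.
import sympy as sp

q1, q2, l1, l2 = sp.symbols('q1 q2 l1 l2')

# Forward kinematics of the end effector
x = l1 * sp.cos(q1) + l2 * sp.cos(q1 + q2)
y = l1 * sp.sin(q1) + l2 * sp.sin(q1 + q2)

# Jacobian relating joint rates to end-effector velocity,
# trigonometrically simplified as SML does for its models
J = sp.trigsimp(sp.Matrix([x, y]).jacobian([q1, q2]))
print(J)
```

Keeping the model symbolic, as the abstract emphasizes, is what lets design parameters such as the link lengths l1 and l2 be identified and optimized before any numbers are substituted.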
NASA Technical Reports Server (NTRS)
Afjeh, Abdollah A.; Reed, John A.
2003-01-01
Mesh generation has long been recognized as a bottleneck in the CFD process. While much research on automating the volume mesh generation process has been relatively successful, these methods rely on an appropriate initial surface triangulation to work properly. Surface discretization has been one of the least automated steps in computational simulation due to its dependence on implicitly defined CAD surfaces and curves. Differences in CAD geometry engines manifest themselves in discrepancies in their interpretation of the same entities. This lack of "good" geometry causes significant problems for mesh generators, requiring users to "repair" the CAD geometry before mesh generation. The problem is exacerbated when CAD geometry is translated to other forms (e.g., IGES), which do not include important topological and construction information in addition to entity geometry. One technique to avoid these problems is to access the CAD geometry directly from the mesh generating software, rather than through files. By accessing the geometry model (not a discretized version) in its native environment, this approach avoids translation to a format which can deplete the model of topological information. Our approach to enable models developed in the Denali software environment to directly access CAD geometry and functions is through an Application Programming Interface (API) known as CAPRI. CAPRI provides a layer of indirection through which CAD-specific data may be accessed by an application program using CAD-system-neutral C and FORTRAN language function calls. CAPRI supports a general set of CAD operations such as truth testing, geometry construction and entity queries.
NCC Simulation Model: Simulating the operations of the network control center, phase 2
NASA Technical Reports Server (NTRS)
Benjamin, Norman M.; Paul, Arthur S.; Gill, Tepper L.
1992-01-01
The simulation of the network control center (NCC) is in the second phase of development. This phase seeks to further develop the work performed in phase one. Phase one concentrated on the computer systems and interconnecting network. The focus of phase two will be the implementation of the network message dialogues and the resources controlled by the NCC. These resources are requested, initiated, monitored and analyzed via network messages. In the NCC, network messages are presented in the form of packets that are routed across the network. These packets are generated, encoded, decoded and processed by the network host processors that generate and service the message traffic on the network that connects these hosts. As a result, the message traffic is used to characterize the work done by the NCC and the connected network. Phase one of the model development represented the NCC as a network of bi-directional single-server queues and message-generating sources. The generators represented the external segment processors. The server-based queues represented the host processors. The NCC model consists of the internal and external processors which generate message traffic on the network that links these hosts. To fully realize the objective of phase two, it is necessary to identify and model the processes in each internal processor. These processes live in the operating system of the internal host computers and handle tasks such as high-speed message exchanging, ISN and NFE interface, event monitoring, network monitoring, and message logging. Interprocess communication is achieved through the operating system facilities. The overall performance of the host is determined by its ability to service messages generated by both internal and external processors.
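The phase-one representation of a host as a single-server message queue can be sketched with a minimal discrete-event loop (Poisson arrivals, exponential service). The rates below are illustrative, not NCC parameters:

```python
# Minimal discrete-event sketch of one host modeled as a single-server
# message queue with Poisson arrivals and exponential service times.
import random

def mean_wait(arrival_rate, service_rate, n_messages, seed=1):
    random.seed(seed)
    t_arrive = 0.0
    server_free_at = 0.0
    total_wait = 0.0
    for _ in range(n_messages):
        t_arrive += random.expovariate(arrival_rate)
        start = max(t_arrive, server_free_at)   # message waits if server busy
        total_wait += start - t_arrive
        server_free_at = start + random.expovariate(service_rate)
    return total_wait / n_messages

light = mean_wait(0.5, 1.0, 20000)   # 50% utilization
heavy = mean_wait(0.9, 1.0, 20000)   # 90% utilization: waits grow sharply
print(light, heavy)
```

Even this toy model shows why message traffic characterizes the work done by the NCC: queueing delay, not raw service time, dominates as host utilization rises.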
Aerodynamics and flow features of a damselfly in takeoff flight.
Bode-Oke, Ayodeji T; Zeyghami, Samane; Dong, Haibo
2017-09-26
Flight initiation is fundamental for survival, escape from predators, and lifting payload from one place to another in biological fliers, and can be broadly classified into jumping and non-jumping takeoffs. During jumping takeoffs, the legs generate most of the initial impulse; in non-jumping takeoffs, which are usually voluntary, slow, and stable, the wings generate most of the forces. It is of great interest to understand how these non-jumping takeoffs occur and what strategies insects use to generate the large forces required for this highly demanding flight initiation mode. Here, for the first time, we report accurate wing and body kinematics measurements of a damselfly during a non-jumping takeoff. Furthermore, using a high-fidelity computational fluid dynamics simulation, we identify the 3D flow features and compute the wing aerodynamic forces to unravel the key mechanisms responsible for generating large flight forces. Our numerical results show that a damselfly generates about three times its body weight during the first half-stroke for liftoff. In generating these forces, the wings flap through a steeply inclined stroke plane with respect to the horizon, slicing through the air at high angles of attack (45°-50°). Consequently, a leading edge vortex (LEV) is formed during both the downstroke and upstroke on all four wings. The formation of the LEV, however, is inhibited in the subsequent upstrokes following takeoff. Accordingly, we observe a drastic reduction in the magnitude of the aerodynamic force, signifying the importance of the LEV in augmenting force production. Our analysis also shows that forewing-hindwing interaction plays a favorable role in enhancing both lift and thrust production during takeoff.
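A back-of-envelope quasi-steady estimate shows how wing area, speed, and a high-lift coefficient can combine to reach multiples of body weight. Every number below is hypothetical, chosen only for illustration; the paper's forces come from CFD, not from this formula:

```python
# Quasi-steady lift check: L = 0.5*rho*v^2*S*C_L versus body weight.
# All values are hypothetical, not measurements from the damselfly study.
rho = 1.2            # air density, kg/m^3
v = 3.2              # effective wing speed, m/s (hypothetical)
S = 4 * 2.0e-4       # total area of four wings, m^2 (hypothetical)
C_L = 1.8            # lift coefficient at high angle of attack with an LEV (hypothetical)

lift = 0.5 * rho * v**2 * S * C_L   # newtons
weight = 0.3e-3 * 9.81              # 0.3 g insect (hypothetical)
print(round(lift / weight, 2))
```

The high lift coefficient assumed here is the kind of value that only a stably attached leading edge vortex makes plausible, which is why loss of the LEV after takeoff brings the computed force down so sharply.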
Policy Process Editor for P3BM Software
NASA Technical Reports Server (NTRS)
James, Mark; Chang, Hsin-Ping; Chow, Edward T.; Crichton, Gerald A.
2010-01-01
A computer program enables the generation of input to a suite of policy-, process-, and performance-based management (P3BM) software in the form of graphical representations of process flows with embedded natural-language policy statements. This program (1) serves as an interface between users and the Hunter software, which translates the input into machine-readable form; and (2) enables users to initialize and monitor the policy-implementation process. The program provides an intuitive graphical interface for incorporating natural-language policy statements into business-process flow diagrams. Thus, it enables users who dictate policies to intuitively embed their intended process flows as they state the policies, reducing the likelihood of errors and reducing the time between declaration and execution of policy.
Transfer to intermediate forms following concept discrimination by pigeons: chimeras and morphs.
Ghosh, Natasha; Lea, Stephen E G; Noury, Malia
2004-01-01
Two experiments examined pigeons' generalization to intermediate forms following training of concept discriminations. In Experiment 1, the training stimuli were sets of images of dogs and cats, and the transfer stimuli were head/body chimeras, which humans tend to categorize more readily in terms of the head part rather than the body part. In Experiment 2, the training stimuli were sets of images of heads of dogs and cats, and the intermediate stimuli were computer-generated morphs. In both experiments, pigeons learned the concept discrimination quickly and generalized with some decrement to novel instances of the categories. In both experiments, transfer tests were carried out with intermediate forms generated from both familiar and novel exemplars of the training sets. In Experiment 1, the pigeons' transfer performance, unlike that of human infants exposed to similar stimuli, was best predicted by the body part of the stimulus when the chimeras were formed from familiar exemplars. Spatial frequency analysis of the stimuli showed that the body parts were richer in high spatial frequencies than the head parts, so these data are consistent with the hypothesis that categorization is more dependent on local stimulus features in pigeons than in humans. There was no corresponding trend when the chimeras were formed from novel exemplars. In Experiment 2, when morphs of training stimuli were used, response rates declined smoothly as the proportion of the morph contributed by the positive stimulus fell, although results with morphs of novel stimuli were again less orderly. PMID:15540501
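The computer-generated morphs of Experiment 2 interpolate between category exemplars. The simplest version is a pixel-wise cross-dissolve between two aligned images; real morphing software also warps geometry, which this sketch omits, and the arrays below merely stand in for images:

```python
# Pixel-wise cross-dissolve, the simplest image morph.
import numpy as np

def cross_dissolve(img_a, img_b, alpha):
    """alpha=0 returns img_a, alpha=1 returns img_b, 0.5 an even blend."""
    return (1 - alpha) * img_a + alpha * img_b

cat = np.zeros((4, 4))   # stand-in "cat" exemplar
dog = np.ones((4, 4))    # stand-in "dog" exemplar
halfway = cross_dissolve(cat, dog, 0.5)
print(halfway[0, 0])     # 0.5
```

Sweeping alpha from 0 to 1 produces the graded stimulus continuum over which the pigeons' response rates declined smoothly.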
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Shaobu; Lu, Shuai; Zhou, Ning
In interconnected power systems, dynamic model reduction can be applied to generators outside the area of interest to mitigate the computational cost of transient stability studies. This paper presents an approach for deriving the reduced dynamic model of the external area based on dynamic response measurements, which comprises three steps: dynamic-feature extraction, attribution, and reconstruction (DEAR). In the DEAR approach, a feature extraction technique, such as singular value decomposition (SVD), is applied to the measured generator dynamics after a disturbance. Characteristic generators are then identified in the feature attribution step by matching the extracted dynamic features with the highest similarity, forming a suboptimal 'basis' of system dynamics. In the reconstruction step, generator state variables such as rotor angles and voltage magnitudes are approximated with a linear combination of the characteristic generators, resulting in a quasi-nonlinear reduced model of the original external system. The network model is unchanged in the DEAR method. Tests on several IEEE standard systems show that the proposed method achieves better reduction ratios and response errors than traditional coherency aggregation methods.
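The three DEAR steps can be sketched numerically on synthetic rank-2 "generator" signals. This is illustrative only; the paper works with measured rotor angles and voltage magnitudes on IEEE test systems:

```python
# Numerical sketch of the DEAR steps on synthetic rank-2 signals.
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0, 10, 200)
modes = np.vstack([np.sin(t), np.exp(-0.3 * t) * np.cos(3 * t)])
X = rng.normal(size=(20, 2)) @ modes        # 20 measured responses, 2 true modes

# Step 1, feature extraction: dominant temporal features via SVD
U, s, Vt = np.linalg.svd(X, full_matrices=False)
f0, f1 = Vt[0], Vt[1]

# Step 2, attribution: pick the generator most similar to each feature
norms = np.linalg.norm(X, axis=1)
idx0 = int(np.argmax(np.abs(X @ f0) / norms))
rest = [i for i in range(X.shape[0]) if i != idx0]
idx1 = rest[int(np.argmax(np.abs(X[rest] @ f1) / norms[rest]))]
basis = X[[idx0, idx1]]                     # characteristic generators

# Step 3, reconstruction: least-squares fit of all signals onto the basis
coeff, *_ = np.linalg.lstsq(basis.T, X.T, rcond=None)
X_hat = (basis.T @ coeff).T
print(np.max(np.abs(X - X_hat)))
```

Because the basis is built from actual measured signals rather than abstract modes, the reduced model keeps a physical interpretation, which is the point of the attribution step.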
Computational Study of Intramolecular Heterocyclic Ring Formation with Cyclic Phosphazenes.
Miller, Whelton A; Moore, Preston B
2014-08-01
Polyphosphazenes, because of their unique properties, have generated many opportunities to explore a variety of applications. These applications include areas such as biomedical research (e.g. drug delivery) and material science (e.g. fire-resistant polymers). Phosphazenes potentially have more variations than benzene analogues because of different substitution patterns. Here we present a computational study of the chemical modifications to a group of cyclic phosphazenes, mainly hexachlorophosphazene (PNCl2)3. This study focuses on the relative reaction energies of hexachlorophosphazene to understand the geometry of its derivatives and the complexes they likely form. We compare diols, amino alcohols, and diamines with a carbon linker of 1-7 atoms. These heteroatom chains are attached to a single phosphorus atom or to adjoining phosphorus atoms to form ring structures of geminal, vicinal (cis), and vicinal (trans) moieties. We find that the reactivities of "heteroatom caps" are predicted to be O,O (diol) > N,O (amino alcohol) > N,N (diamine). These results can be used to predict energetics and thus the stability of new compounds for biomedical and industrial applications.
NASA Technical Reports Server (NTRS)
Caruso, J. J.
1984-01-01
Finite element substructuring is used to predict unidirectional fiber composite hygral (moisture), thermal, and mechanical properties. COSMIC NASTRAN and MSC/NASTRAN are used to perform the finite element analysis. The results obtained from the finite element model are compared with those obtained from the simplified composite micromechanics equations. Unidirectional composite structures made of boron/HM-epoxy, S-glass/IMHS-epoxy, and AS/IMHS-epoxy are studied. The finite element analysis is performed using three-dimensional isoparametric brick elements and two distinct models. The first model consists of a single cell (one fiber surrounded by matrix) to form a square. The second model uses the single cell and substructuring to form a nine-cell square array. To compare computer time and results with the nine-cell superelement model, another nine-cell model is constructed using conventional mesh generation techniques. An independent computer program consisting of the simplified micromechanics equations is developed to predict the hygral, thermal, and mechanical properties for this comparison. The results indicate that advanced techniques can be used advantageously for fiber composite micromechanics.
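For orientation, the classical rule-of-mixtures relations give the flavor of such simplified micromechanics, although the study's equations (Chamis-style relations) differ in detail. The property values below are generic, not the study's material data:

```python
# Flavor of simplified micromechanics: rule of mixtures (longitudinal)
# and inverse rule of mixtures (transverse). Generic property values.
def longitudinal_modulus(Ef, Em, Vf):
    """E11 = Vf*Ef + (1 - Vf)*Em  (fiber-dominated, Voigt bound)."""
    return Vf * Ef + (1 - Vf) * Em

def transverse_modulus(Ef, Em, Vf):
    """1/E22 = Vf/Ef + (1 - Vf)/Em  (matrix-dominated, Reuss bound)."""
    return 1.0 / (Vf / Ef + (1 - Vf) / Em)

Ef, Em, Vf = 400.0, 3.5, 0.6   # fiber GPa, matrix GPa, fiber volume fraction
print(longitudinal_modulus(Ef, Em, Vf))   # fiber-dominated
print(transverse_modulus(Ef, Em, Vf))     # matrix-dominated
```

The large gap between the two predicted moduli is exactly the anisotropy that the finite element unit-cell models in the study resolve in more detail.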
Avoiding and tolerating latency in large-scale next-generation shared-memory multiprocessors
NASA Technical Reports Server (NTRS)
Probst, David K.
1993-01-01
A scalable solution to the memory-latency problem is necessary to prevent the large latencies of synchronization and memory operations inherent in large-scale shared-memory multiprocessors from degrading performance. We distinguish latency avoidance and latency tolerance. Latency is avoided when data is brought to nearby locales for future reference. Latency is tolerated when references are overlapped with other computation. Latency-avoiding locales include: processor registers, data caches used temporally, and nearby memory modules. Tolerating communication latency requires parallelism, allowing the overlap of communication and computation. Latency-tolerating techniques include: vector pipelining, data caches used spatially, prefetching in various forms, and multithreading in various forms. Relaxing the consistency model permits increased use of avoidance and tolerance techniques. Each model is a mapping from the program text to sets of partial orders on program operations; it is a convention about which temporal precedences among program operations are necessary. Information about temporal locality and parallelism constrains the use of avoidance and tolerance techniques. Suitable architectural primitives and compiler technology are required to exploit the increased freedom to reorder and overlap operations in relaxed models.
Entropy generation method to quantify thermal comfort.
Boregowda, S C; Tiwari, S N; Chaturvedi, S K
2001-12-01
The present paper presents a thermodynamic approach to assess the quality of human-thermal environment interaction and to quantify thermal comfort. The approach involves the development of an entropy generation term by applying the second law of thermodynamics to the combined human-environment system. The entropy generation term combines both human thermal physiological responses and thermal environmental variables to provide an objective measure of thermal comfort. The original concepts and definitions form the basis for establishing the mathematical relationship between thermal comfort and the entropy generation term. As a result of this logical and deterministic approach, an Objective Thermal Comfort Index (OTCI) is defined and established as a function of entropy generation. In order to verify the entropy-based thermal comfort model, human thermal physiological responses due to changes in ambient conditions are simulated using a well-established and validated human thermal model developed at the Institute of Environmental Research of Kansas State University (KSU). The finite-element-based KSU human thermal computer model is utilized as a "Computational Environmental Chamber" to conduct a series of simulations examining the human thermal responses to different environmental conditions. The output from the simulation, which includes the human thermal responses, and the input data, consisting of environmental conditions, are fed into the thermal comfort model. Continuous monitoring of thermal comfort in comfortable and extreme environmental conditions is demonstrated. The Objective Thermal Comfort values obtained from the entropy-based model are validated against regression-based Predicted Mean Vote (PMV) values; the PMV values are generated by inserting into the regression equation the same air temperatures and vapor pressures used in the computer simulation. The preliminary results indicate that the OTCI and PMV values correlate well under ideal conditions.
However, an experimental study is needed in the future to fully establish the validity of the OTCI formula and the model. One practical application of this index is that it could be integrated into thermal control systems to develop human-centered environmental control systems for potential use in aircraft, mass transit vehicles, intelligent building systems, and space vehicles.
Entropy generation method to quantify thermal comfort
NASA Technical Reports Server (NTRS)
Boregowda, S. C.; Tiwari, S. N.; Chaturvedi, S. K.
2001-01-01
The present paper presents a thermodynamic approach to assess the quality of human-thermal environment interaction and to quantify thermal comfort. The approach involves the development of an entropy generation term by applying the second law of thermodynamics to the combined human-environment system. The entropy generation term combines both human thermal physiological responses and thermal environmental variables to provide an objective measure of thermal comfort. The original concepts and definitions form the basis for establishing the mathematical relationship between thermal comfort and the entropy generation term. As a result of this logical and deterministic approach, an Objective Thermal Comfort Index (OTCI) is defined and established as a function of entropy generation. In order to verify the entropy-based thermal comfort model, human thermal physiological responses due to changes in ambient conditions are simulated using a well-established and validated human thermal model developed at the Institute of Environmental Research of Kansas State University (KSU). The finite-element-based KSU human thermal computer model is utilized as a "Computational Environmental Chamber" to conduct a series of simulations examining the human thermal responses to different environmental conditions. The output from the simulation, which includes the human thermal responses, and the input data, consisting of environmental conditions, are fed into the thermal comfort model. Continuous monitoring of thermal comfort in comfortable and extreme environmental conditions is demonstrated. The Objective Thermal Comfort values obtained from the entropy-based model are validated against regression-based Predicted Mean Vote (PMV) values; the PMV values are generated by inserting into the regression equation the same air temperatures and vapor pressures used in the computer simulation. The preliminary results indicate that the OTCI and PMV values correlate well under ideal conditions.
However, an experimental study is needed in the future to fully establish the validity of the OTCI formula and the model. One practical application of this index is that it could be integrated into thermal control systems to develop human-centered environmental control systems for potential use in aircraft, mass transit vehicles, intelligent building systems, and space vehicles.
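The core quantity in both versions of this record is the entropy generated when metabolic heat flows from the body to a cooler environment, S_gen = Q(1/T_env - 1/T_body). The sketch below evaluates it for illustrative values; mapping S_gen onto the OTCI scale is the papers' contribution and is not reproduced here:

```python
# Entropy generation rate for heat flow from body to environment.
# S_gen = Q*(1/T_env - 1/T_body); all numbers are illustrative only.
def entropy_generation(Q, T_body, T_env):
    """Entropy generation rate in W/K for heat flow Q in watts."""
    return Q * (1.0 / T_env - 1.0 / T_body)

Q = 100.0        # heat rejected by the body, W (illustrative)
T_body = 307.0   # mean skin temperature, K (about 34 C)
for T_env in (303.0, 293.0, 283.0):
    print(T_env, round(entropy_generation(Q, T_body, T_env), 4))
```

The monotone rise of S_gen as the environment departs from skin temperature is what makes it usable as an objective discomfort measure in the second law framing.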
Applications of computer-graphics animation for motion-perception research
NASA Technical Reports Server (NTRS)
Proffitt, D. R.; Kaiser, M. K.
1986-01-01
The advantages and limitations of using computer-animated stimuli in studying motion perception are presented and discussed. Most current programs of motion perception research could not be pursued without computer graphics animation. Computer-generated displays afford latitudes of freedom and control that are almost impossible to attain through conventional methods. There are, however, limitations to this presentational medium. At present, computer-generated displays present simplified approximations of the dynamics in natural events. Very little is known about how the differences between natural events and computer simulations influence perceptual processing. In practice, it is assumed that the differences are irrelevant to the questions under study and that findings with computer-generated stimuli will generalize to natural events.
The Evolution of Dental Materials for Hybrid Prosthesis
Gonzalez, Jorge
2014-01-01
Since time immemorial, the replacement of missing teeth has been a medical and cosmetic necessity for humankind. Nowadays, middle-aged population groups have experienced improved oral health compared to previous generations, and the percentage of edentulous adults can be expected to decline further. However, with the continued increase in the older adult population, it is anticipated that the need for some form of full-mouth restoration might increase from 53.8 million in 1991 to 61 million in 2020 [1]. Denture prosthetics has undergone many development stages since the first dentures were fabricated. The introduction of computer-aided design/computer-aided manufacturing (CAD/CAM) has resulted in more accurate manufacturing of prosthetic frameworks, greater accuracy of dental restorations, and, in particular, implant-supported prostheses. PMID:24893781
3D analysis of macrosegregation in twin-roll cast AA3003 alloy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Šlapáková, Michaela, E-mail: slapakova@karlov.mff.
Twin-roll cast aluminium alloys have a high potential for industrial applications. However, one of the drawbacks of such materials is an inhomogeneous structure generated by macrosegregation, which appears under certain conditions in the center of sheets during solidification. Segregations in AA3003 alloy form as manganese-, iron- and silicon-rich channels spread in the rolling direction. Their spatial distribution was successfully detected by X-ray computed tomography. Scanning electron microscopy was used for a detailed observation of microstructure, morphology and chemical analysis of the segregation. - Highlights: •Macrosegregations in twin-roll cast sheets stretch along the rolling direction. •X-ray computed tomography is an effective tool for visualization of the segregation. •The segregations copy the shape of grain boundaries.
Multiscale mechanobiology: computational models for integrating molecules to multicellular systems
Mak, Michael; Kim, Taeyoon
2015-01-01
Mechanical signals exist throughout the biological landscape. Across all scales, these signals, in the form of force, stiffness, and deformations, are generated and processed, resulting in an active mechanobiological circuit that controls many fundamental aspects of life, from protein unfolding and cytoskeletal remodeling to collective cell motions. The multiple scales and complex feedback involved present a challenge for fully understanding the nature of this circuit, particularly in development and disease in which it has been implicated. Computational models that accurately predict and are based on experimental data enable a means to integrate basic principles and explore fine details of mechanosensing and mechanotransduction in and across all levels of biological systems. Here we review recent advances in these models along with supporting and emerging experimental findings. PMID:26019013
Pilot factors guidelines for the operational inspection of navigation systems
NASA Technical Reports Server (NTRS)
Sadler, J. F.; Boucek, G. P.
1988-01-01
A computerized, human-engineered inspection technique is developed for use by FAA inspectors in evaluating the pilot factors aspects of aircraft navigation systems. The short title for this project is Nav Handbook. A menu-driven checklist, computer program, and database (Human Factors Design Criteria) were developed and merged to form a self-contained, portable, human factors inspection checklist tool for use in a laboratory or field setting. The automated checklist is tailored for general aviation navigation systems and can be expanded for use with other aircraft systems, transports, or military aircraft. The Nav Handbook inspection concept was demonstrated using a lap-top computer and an Omega/VLF CDU. The program generates standardized inspection reports. Automated checklists for LORAN-C and RNAV were also developed. A Nav Handbook User's Guide is included.
Managing the computational chemistry big data problem: the ioChem-BD platform.
Álvarez-Moreno, M; de Graaf, C; López, N; Maseras, F; Poblet, J M; Bo, C
2015-01-26
We present the ioChem-BD platform ( www.iochem-bd.org ) as a multiheaded tool aimed at managing large volumes of quantum chemistry results from a diverse set of already common simulation packages. The platform has an extensible structure. Its key modules (i) upload output files from common computational chemistry packages, (ii) extract meaningful data from the results, and (iii) generate output summaries in user-friendly formats. Heavy use of the Chemical Markup Language (CML) is made in the intermediate files used by ioChem-BD. From them, using XSL techniques, we manipulate and transform such chemical data sets to meet researchers' needs in the form of HTML5 reports, supporting information, and other research media.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Senesi, Andrew; Lee, Byeongdu
Herein, a general method to calculate the scattering functions of polyhedra, including both regular and semi-regular polyhedra, is presented. These calculations may be achieved by breaking a polyhedron into sets of congruent pieces, thereby reducing computation time by taking advantage of Fourier transforms and inversion symmetry. Each piece belonging to a set or subunit can be generated by either rotation or translation. Further, general strategies to compute truncated, concave, and stellated polyhedra are provided. Using this method, the asymptotic behaviors of the polyhedral scattering functions are compared with that of a sphere. It is shown that, for a regular polyhedron, the form factor oscillation at high q is correlated with the face-to-face distance. In addition, polydispersity affects the Porod constant. The ideas presented herein will be important for the characterization of nanomaterials using small-angle scattering.
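The sphere against which the paper compares its polyhedral results has a standard closed-form form factor. A minimal sketch of that reference curve (the radius, units, and q-range below are illustrative assumptions, not the paper's values):

```python
import numpy as np

def sphere_form_factor(q, R):
    """Normalized small-angle scattering form factor of a solid sphere of
    radius R: P(q) = [3 (sin(qR) - qR cos(qR)) / (qR)^3]^2, with P(0) = 1."""
    x = np.asarray(q, dtype=float) * R
    amp = 3.0 * (np.sin(x) - x * np.cos(x)) / x**3
    return amp**2

# The high-q oscillation envelope of this curve decays as q^-4 (Porod law),
# the asymptotic behavior referenced by the paper.
q = np.logspace(-3, 0, 400)  # e.g. in 1/nm for R = 10 nm
P = sphere_form_factor(q, 10.0)
```

Polyhedral form factors oscillate about a similar envelope, with the oscillation period set (per the abstract) by the face-to-face distance rather than the diameter.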
Materials by numbers: Computations as tools of discovery
Landman, Uzi
2005-01-01
Current issues pertaining to theoretical simulations of materials, with a focus on systems of nanometer-scale dimensions, are discussed. The use of atomistic simulations as high-resolution numerical experiments, enabling and guiding the formulation and testing of analytic theoretical descriptions, is demonstrated through studies of the generation and breakup of nanojets, which have led to the derivation of a stochastic hydrodynamic description. Subsequently, I illustrate the use of computations and simulations as tools of discovery, with examples that include the self-organized formation of nanowires, the surprising nanocatalytic activity of small aggregates of gold, a metal that in bulk form is notoriously chemically inert, and the emergence of rotating electron molecules in two-dimensional quantum dots. I conclude with a brief discussion of some key challenges in nanomaterials simulations. PMID:15870210
Point and path performance of light aircraft: A review and analysis
NASA Technical Reports Server (NTRS)
Smetana, F. O.; Summey, D. C.; Johnson, W. D.
1973-01-01
The literature on methods for predicting the performance of light aircraft is reviewed. The methods discussed in the review extend from the classical instantaneous maximum or minimum technique to techniques for generating mathematically optimum flight paths. Classical point performance techniques are shown to be adequate in many cases, but their accuracies are compromised by the need to use simple lift, drag, and thrust relations in order to get closed-form solutions. Also, the investigation of the effect of changes in weight, altitude, configuration, etc., involves many essentially repetitive calculations. Accordingly, computer programs are provided which can fit arbitrary drag polars and power curves with very high precision and which can then use the resulting fits to compute the performance under the assumption that the aircraft is not accelerating.
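The fitting step the abstract describes can be illustrated with the simplest case, a parabolic drag polar. This is not the report's actual program (which fits arbitrary polars); the coefficients below are assumed values for a light aircraft:

```python
import numpy as np

# Fit a parabolic drag polar CD = CD0 + k*CL^2 to tabulated points,
# then recover a classic point-performance quantity,
# (L/D)max = 1 / (2*sqrt(k*CD0)).
CL = np.array([0.2, 0.4, 0.6, 0.8, 1.0, 1.2])
CD = 0.025 + 0.05 * CL**2          # assumed "measured" polar points
k, CD0 = np.polyfit(CL**2, CD, 1)  # linear least squares in CL^2
LD_max = 1.0 / (2.0 * np.sqrt(k * CD0))
```

Real polars deviate from the parabola near stall and at high Mach, which is why the report's programs accept arbitrary fits rather than the closed form used here.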
correlcalc: Two-point correlation function from redshift surveys
NASA Astrophysics Data System (ADS)
Rohin, Yeluripati
2017-11-01
correlcalc calculates the two-point correlation function (2pCF) of galaxies/quasars using redshift surveys. It can be used for any assumed geometry or cosmology model. Using BallTree algorithms to reduce the computational effort for large datasets, it is a parallelised code suitable for running on clusters as well as personal computers. It takes the redshift (z), Right Ascension (RA), and Declination (DEC) of galaxies and random catalogs as inputs in the form of ASCII or FITS files. If a random catalog is not provided, it generates one of the desired size based on the input redshift distribution and a mangle polygon file (in .ply format) describing the survey geometry. It also calculates different realisations of the (3D) anisotropic 2pCF. Optionally, it makes HEALPix maps of the survey to provide visualization.
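The core computation in any 2pCF code is pair counting fed into an estimator such as Landy-Szalay. A brute-force sketch of that idea follows; correlcalc's actual BallTree acceleration, weighting, and normalization details are not reproduced here:

```python
import numpy as np

def pair_counts(a, b, bins):
    """Histogram of pair separations between point sets a and b.
    Brute force for clarity; codes like correlcalc accelerate this
    step with BallTree queries."""
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    return np.histogram(d.ravel(), bins=bins)[0].astype(float)

def landy_szalay(data, rand, bins):
    """Landy-Szalay estimator xi = (DD - 2*DR + RR) / RR, with a
    simplified pair normalization (self-pairs at zero separation fall
    below the first bin edge when bins start above zero)."""
    nd, nr = len(data), len(rand)
    DD = pair_counts(data, data, bins) / (nd * nd)
    DR = pair_counts(data, rand, bins) / (nd * nr)
    RR = pair_counts(rand, rand, bins) / (nr * nr)
    return (DD - 2.0 * DR + RR) / RR
```

For an unclustered catalog the estimator fluctuates around zero; clustering shows up as an excess of DD over RR at small separations. The brute-force distance matrix is O(N^2) in memory, which is exactly the cost tree methods avoid.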
Performing process migration with allreduce operations
Archer, Charles Jens; Peters, Amanda; Wallenfelt, Brian Paul
2010-12-14
Compute nodes perform allreduce operations that swap processes between nodes. A first allreduce operation generates a first result using a first process from a first compute node, a second process from a second compute node, and zeros from the other compute nodes. The first compute node replaces the first process with the first result. A second allreduce operation generates a second result using the first result from the first compute node, the second process from the second compute node, and zeros from the others. The second compute node replaces the second process with the second result, which is the first process. A third allreduce operation generates a third result using the first result from the first compute node, the second result from the second compute node, and zeros from the others. The first compute node replaces the first result with the third result, which is the second process.
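The three-step arithmetic described above works out if the allreduce combines contributions with bitwise XOR, making it the classic three-step XOR swap performed collectively; that reduction operator is an assumption here, not stated in the abstract. A toy simulation:

```python
from functools import reduce
from operator import xor

def allreduce_xor(contributions):
    """Simulated allreduce: every node contributes a value (zero when it
    does not participate) and all nodes receive the XOR of them all."""
    return reduce(xor, contributions)

# Four simulated nodes; the two processes live on nodes 0 and 1,
# represented here as bit patterns.
p1, p2 = 0b1010, 0b0101
r1 = allreduce_xor([p1, p2, 0, 0])  # node 0 replaces p1 with r1 = p1 ^ p2
r2 = allreduce_xor([r1, p2, 0, 0])  # node 1 replaces p2 with r2 == p1
r3 = allreduce_xor([r1, r2, 0, 0])  # node 0 replaces r1 with r3 == p2
assert (r2, r3) == (p1, p2)         # the two processes have swapped
```

Each step reuses the collective network rather than point-to-point sends, which is the point of doing migration via allreduce.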
Colour computer-generated holography for point clouds utilizing the Phong illumination model.
Symeonidou, Athanasia; Blinder, David; Schelkens, Peter
2018-04-16
A technique integrating the bidirectional reflectance distribution function (BRDF) is proposed to generate realistic high-quality colour computer-generated holograms (CGHs). We build on prior work, namely a fast computer-generated holography method for point clouds that handles occlusions. We extend the method by integrating the Phong illumination model so that the properties of the objects' surfaces are taken into account to achieve natural light phenomena such as reflections and shadows. Our experiments show that rendering holograms with the proposed algorithm produces realistic-looking objects without any noteworthy increase in computational cost.
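The Phong model the paper integrates is the classic empirical shading model. A minimal scalar sketch of it follows; the paper's hologram pipeline, point-cloud occlusion handling, and parameter choices are not reproduced, and the coefficient values are illustrative:

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def phong_intensity(normal, light_dir, view_dir, ka, kd, ks, shininess):
    """Classic Phong model: ambient + diffuse (N.L) + specular (R.V)^n.
    In a CGH pipeline such a term would weight the amplitude of each
    point source; only the scalar intensity is computed here."""
    n, l, v = normalize(normal), normalize(light_dir), normalize(view_dir)
    ndotl = max(0.0, sum(a * b for a, b in zip(n, l)))
    # Reflection of the light direction about the normal.
    r = tuple(2.0 * ndotl * nc - lc for nc, lc in zip(n, l))
    rdotv = max(0.0, sum(a * b for a, b in zip(r, v)))
    return ka + kd * ndotl + ks * rdotv ** shininess
```

With the light and viewer aligned along the surface normal, all three terms contribute fully and the intensity is simply ka + kd + ks.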
NASA Technical Reports Server (NTRS)
Taylor, N. L.
1983-01-01
In response to a need for improved computer-generated plots that are acceptable to the Langley publication process, the LaRC Graphics Output System has been modified to encompass the publication requirements, and a guideline has been established. This guideline deals only with the publication requirements of computer-generated plots. This report explains the capability that authors of NASA technical reports can use to obtain publication-quality computer-generated plots for the Langley publication process. The rules applied in developing this guideline and examples illustrating the rules are included.
MSL: A Measure to Evaluate Three-dimensional Patterns in Gene Expression Data
Gutiérrez-Avilés, David; Rubio-Escudero, Cristina
2015-01-01
Microarray technology is widely used in biological research environments due to its ability to monitor RNA concentration levels. The analysis of the data generated represents a computational challenge due to the characteristics of these data. Clustering techniques are widely applied to create groups of genes that exhibit similar behavior. Biclustering relaxes the constraints for grouping, allowing genes to be evaluated only under a subset of the conditions. Triclustering appeared for the analysis of longitudinal experiments in which the genes are evaluated under certain conditions at several time points. These triclusters provide hidden information in the form of behavior patterns from temporal experiments with microarrays, relating subsets of genes, experimental conditions, and time points. We present an evaluation measure for triclusters called the Multi Slope Measure, based on the similarity among the angles of the slopes of the profiles traced by the genes, conditions, and times of the tricluster. PMID:26124630
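The core idea, scoring how parallel the expression profiles are via the angles of their segment slopes, can be sketched as follows. This is a simplified illustrative stand-in, not the paper's actual MSL formula:

```python
import math

def profile_angles(profile):
    """Angles (radians) of the slopes between consecutive expression
    values, taking the step between adjacent time points as 1."""
    return [math.atan2(b - a, 1.0) for a, b in zip(profile, profile[1:])]

def slope_similarity(profiles):
    """Illustrative score in [0, 1]: 1 when all profiles have identical
    segment slopes (perfectly parallel shapes), lower as the slope
    angles diverge. A simplified stand-in for the MSL measure."""
    angle_sets = list(zip(*(profile_angles(p) for p in profiles)))
    max_spread = math.pi  # atan2(dy, 1) lies in (-pi/2, pi/2)
    spreads = [(max(a) - min(a)) / max_spread for a in angle_sets]
    return 1.0 - sum(spreads) / len(spreads)

# Two shifted-but-parallel profiles score a perfect 1.0.
assert slope_similarity([[1, 2, 3], [4, 5, 6]]) == 1.0
```

Using angles rather than raw slope differences keeps the score bounded even when one profile has a very steep segment.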
Understanding of Leaf Development-the Science of Complexity.
Malinowski, Robert
2013-06-25
The leaf is the major organ involved in light perception and conversion of solar energy into organic carbon. In order to adapt to different natural habitats, plants have developed a variety of leaf forms, ranging from simple to compound, with various forms of dissection. Due to the enormous cellular complexity of leaves, understanding the mechanisms regulating development of these organs is difficult. In recent years there has been a dramatic increase in the use of technically advanced imaging techniques and computational modeling in studies of leaf development. Additionally, molecular tools for manipulation of morphogenesis were successfully used for in planta verification of developmental models. Results of these interdisciplinary studies show that global growth patterns influencing final leaf form are generated by cooperative action of genetic, biochemical, and biomechanical inputs. This review summarizes recent progress in integrative studies on leaf development and illustrates how intrinsic features of leaves (including their cellular complexity) influence the choice of experimental approach.
de Jong, Wibe A; Walker, Andrew M; Hanwell, Marcus D
2013-05-24
Multidisciplinary integrated research requires the ability to couple the diverse sets of data obtained from a range of complex experiments and computer simulations. Integrating data requires semantically rich information. In this paper an end-to-end use of semantically rich data in computational chemistry is demonstrated utilizing the Chemical Markup Language (CML) framework. Semantically rich data is generated by the NWChem computational chemistry software with the FoX library and utilized by the Avogadro molecular editor for analysis and visualization. The NWChem computational chemistry software has been modified and coupled to the FoX library to write CML compliant XML data files. The FoX library was expanded to represent the lexical input files and molecular orbitals used by the computational chemistry software. Draft dictionary entries and a format for molecular orbitals within CML CompChem were developed. The Avogadro application was extended to read in CML data, and display molecular geometry and electronic structure in the GUI allowing for an end-to-end solution where Avogadro can create input structures, generate input files, NWChem can run the calculation and Avogadro can then read in and analyse the CML output produced. The developments outlined in this paper will be made available in future releases of NWChem, FoX, and Avogadro. The production of CML compliant XML files for computational chemistry software such as NWChem can be accomplished relatively easily using the FoX library. The CML data can be read in by a newly developed reader in Avogadro and analysed or visualized in various ways. A community-based effort is needed to further develop the CML CompChem convention and dictionary. This will enable the long-term goal of allowing a researcher to run simple "Google-style" searches of chemistry and physics and have the results of computational calculations returned in a comprehensible form alongside articles from the published literature.
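The CML files the paper describes are ordinary namespaced XML, so a downstream tool can read them with standard libraries. A minimal sketch using Python's standard library; the sample molecule below is an assumed water geometry, and only the common CML molecule/atomArray/atom convention is shown, not NWChem's full CompChem output:

```python
import xml.etree.ElementTree as ET

CML_NS = "{http://www.xml-cml.org/schema}"
sample = """<molecule xmlns="http://www.xml-cml.org/schema">
  <atomArray>
    <atom id="a1" elementType="O" x3="0.000" y3="0.000" z3="0.117"/>
    <atom id="a2" elementType="H" x3="0.000" y3="0.757" z3="-0.469"/>
    <atom id="a3" elementType="H" x3="0.000" y3="-0.757" z3="-0.469"/>
  </atomArray>
</molecule>"""

root = ET.fromstring(sample)
# Collect (element, x, y, z) tuples for every atom in the document.
atoms = [(a.get("elementType"),
          float(a.get("x3")), float(a.get("y3")), float(a.get("z3")))
         for a in root.iter(CML_NS + "atom")]
```

This element/attribute regularity (versus free-form text logs) is what lets tools like Avogadro consume the output directly and what makes dictionary-backed search over computed results feasible.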
Quasicrystals and Quantum Computing
NASA Astrophysics Data System (ADS)
Berezin, Alexander A.
1997-03-01
In Quantum (Q) Computing, qubits form Q-superpositions for macroscopic times. One scheme for ultra-fast Q-computing can be based on quasicrystals. Ultrafast processing in Q-coherent structures (and the very existence of durable Q-superpositions) may be a 'consequence' of the presence of the entire manifold of integer arithmetic (A0, the aleph-naught of Georg Cantor) at any 4-point of space-time and, furthermore, at any point of any multidimensional phase space of (any) N-particle Q-system. The latter, apart from quasicrystals, can include dispersed and/or diluted systems (Berezin, 1994). In such systems, such alleged centrepieces of Q-computing as the ability to rapidly factorize long integers can be processed by sheer virtue of the fact that the entire infinite pattern of prime numbers is instantaneously available as a 'free lunch' at any instant/point. The infinitely rich pattern of A0 (including the pattern of primes and almost-primes) acts as an 'independent' physical effect which directly generates Q-dynamics (and the physical world) 'out of nothing'. Thus Q-nonlocality can ultimately be based on instantaneous interconnectedness through the ever-the-same structure of A0 (the 'Platonic field' of integers).
Analyzing Spacecraft Telecommunication Systems
NASA Technical Reports Server (NTRS)
Kordon, Mark; Hanks, David; Gladden, Roy; Wood, Eric
2004-01-01
Multi-Mission Telecom Analysis Tool (MMTAT) is a C-language computer program for analyzing proposed spacecraft telecommunication systems. MMTAT utilizes parameterized input and computational models that can be run on standard desktop computers to perform fast and accurate analyses of telecommunication links. MMTAT is easy to use and can easily be integrated with other software applications and run as part of almost any computational simulation. It is distributed as either a stand-alone application program with a graphical user interface or a linkable library with a well-defined set of application programming interface (API) calls. As a stand-alone program, MMTAT provides both textual and graphical output. The graphs make it possible to understand, quickly and easily, how telecommunication performance varies with variations in input parameters. A delimited text file that can be read by any spreadsheet program is generated at the end of each run. The API in the linkable-library form of MMTAT enables the user to control simulation software and to change parameters during a simulation run. Results can be retrieved either at the end of a run or by use of a function call at any time step.
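The kind of link analysis a tool like MMTAT performs reduces, in its simplest form, to a dB-domain link budget. The sketch below is illustrative only; MMTAT's actual parameterized models, losses, and API are not reproduced, and all parameter names here are assumptions:

```python
import math

def free_space_path_loss_db(distance_m, freq_hz):
    """Free-space path loss in dB: 20*log10(4*pi*d/lambda)."""
    c = 299_792_458.0  # speed of light, m/s
    return 20.0 * math.log10(4.0 * math.pi * distance_m * freq_hz / c)

def link_margin_db(eirp_dbw, rx_gain_dbi, distance_m, freq_hz,
                   data_rate_bps, system_noise_temp_k, required_ebn0_db):
    """Simple dB-domain link budget: received power minus noise gives
    Eb/N0; margin is Eb/N0 above the required threshold."""
    k_db = -228.6  # Boltzmann's constant in dBW/(K*Hz)
    prx = eirp_dbw + rx_gain_dbi - free_space_path_loss_db(distance_m, freq_hz)
    ebn0 = (prx - k_db
            - 10.0 * math.log10(system_noise_temp_k)
            - 10.0 * math.log10(data_rate_bps))
    return ebn0 - required_ebn0_db
```

Because every term is in dB, sweeping one input (say, distance) and plotting the margin is trivial, which is the kind of parameter-variation view the MMTAT graphs provide.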
A Discrete Global Grid System Programming Language Using MapReduce
NASA Astrophysics Data System (ADS)
Peterson, P.; Shatz, I.
2016-12-01
A discrete global grid system (DGGS) is a powerful mechanism for storing and integrating geospatial information. As a "pixelization" of the Earth, many image processing techniques lend themselves to the transformation of data values referenced to the DGGS cells. It has been shown that image algebra, as an example, and advanced algebra, like the Fast Fourier Transform, can be used on the DGGS tiling structure for geoprocessing and spatial analysis. MapReduce has been shown to provide advantages for processing and generating large data sets within distributed and parallel computing. The DGGS structure is ideally suited for big distributed Earth data. We proposed that basic expressions could be created to form the atoms of a generalized DGGS language using the MapReduce programming model. We created three very efficient expressions: Selectors (aka filter) - a selection function that generates a set of cells, cell collections, or geometries; Calculators (aka map) - a computational function (including quantization of raw measurements and data sources) that generates values in a DGGS cell; and Aggregators (aka reduce) - a function that generates spatial statistics from cell values within a cell. We found that these three basic MapReduce operations, along with a fourth function, the Iterator, for horizontal and vertical traversal of any DGGS structure, provided simple building blocks resulting in very efficient operations and processes that could be used with any DGGS. We provide examples and a demonstration of their effectiveness using the ISEA3H DGGS on the PYXIS Studio.
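The three expressions map directly onto the filter/map/reduce idiom. A toy sketch on a flat dictionary of cells (the cell ids and values are invented; real DGGS cells carry hierarchical indices and geometry):

```python
from functools import reduce

# Toy "DGGS": cells keyed by id, each holding a raw value (e.g., elevation).
cells = {f"cell{i}": float(v) for i, v in enumerate([3, 7, 1, 9, 4])}

def selector(cells, predicate):
    """Selector (aka filter): choose a set of cells by a predicate."""
    return {k: v for k, v in cells.items() if predicate(v)}

def calculator(cells, fn):
    """Calculator (aka map): compute a new value in each selected cell."""
    return {k: fn(v) for k, v in cells.items()}

def aggregator(cells, fn, init):
    """Aggregator (aka reduce): a spatial statistic over cell values."""
    return reduce(fn, cells.values(), init)

high = selector(cells, lambda v: v > 2)              # filter
scaled = calculator(high, lambda v: v * 10)          # map
total = aggregator(scaled, lambda a, b: a + b, 0.0)  # reduce
```

The fourth expression, the Iterator, would replace the flat dictionary traversal here with walks across resolutions and neighbors of the hierarchical grid; it is omitted from this sketch.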
Computing aerodynamic sound using advanced statistical turbulence theories
NASA Technical Reports Server (NTRS)
Hecht, A. M.; Teske, M. E.; Bilanin, A. J.
1981-01-01
It is noted that the calculation of turbulence-generated aerodynamic sound requires knowledge of the spatial and temporal variation of Q_ij(xi_k, tau), the two-point, two-time turbulent velocity correlations. A technique is presented to obtain an approximate form of these correlations based on closure of the Reynolds stress equations by modeling of higher order terms. The governing equations for Q_ij are first developed for a general flow. The case of homogeneous, stationary turbulence in a unidirectional constant shear mean flow is then assumed. The required closure form for Q_ij is selected which is capable of qualitatively reproducing experimentally observed behavior. This form contains separation-time-dependent scale factors as parameters and depends explicitly on spatial separation. The approximate forms of Q_ij are used in the differential equations and integral moments are taken over the spatial domain. The velocity correlations are used in the Lighthill theory of aerodynamic sound by assuming normal joint probability.