Capability Maturity Model (CMM) for Software Process Improvements
NASA Technical Reports Server (NTRS)
Ling, Robert Y.
2000-01-01
This slide presentation reviews the Avionic Systems Division's implementation of the Capability Maturity Model (CMM) to improve the software development process. It covers the steps involved in implementing the model and the benefits of using CMM to improve software development.
Optimizing IV and V for Mature Organizations
NASA Technical Reports Server (NTRS)
Fuhman, Christopher
2003-01-01
NASA intends for its future software development agencies to have at least a Level 3 rating in the Carnegie Mellon University Capability Maturity Model (CMM). The CMM has built-in Verification and Validation (V&V) processes that support higher software quality. Independent Verification and Validation (IV&V) of software developed by mature agencies can therefore be more effective than for software developed by less mature organizations. How does Independent V&V differ with respect to the maturity of an organization? Knowing a priori the maturity of an organization's processes, how can IV&V planners better identify areas of need, choose IV&V activities, and so on? The objective of this research is to provide a complementary set of guidelines and criteria to assist the planning of IV&V activities on a project using a priori knowledge of the measurable maturity levels of the organization developing the software.
Supporting the Use of CERT (registered trademark) Secure Coding Standards in DoD Acquisitions
2012-07-01
Capability Maturity Model Integration (CMMI®) [Davis 2009]. Team Software Process, TSP, and Capability Maturity Model Integration are service... STP Software Test Plan; TEP Test and Evaluation Plan; TSP Team Software Process; V&V verification and validation. CMU/SEI-2012-TN-016... Supporting the Use of CERT® Secure Coding Standards in DoD Acquisitions. Tim Morrow (Software Engineering Institute), Robert Seacord (Software...
2009-09-01
...NII)/CIO: Assistant Secretary of Defense for Networks and Information Integration/Chief Information Officer; CMMI: Capability Maturity Model... a Web-based portal to share knowledge about software process-related methodologies, such as the SEI's Capability Maturity Model Integration (CMMI)... the SEI's IDEAL model, and Lean Six Sigma. For example, the portal features content areas such as software acquisition management, the SEI CMMI...
Process Improvement Should Link to Security: SEPG 2007 Security Track Recap
2007-09-01
the Systems Security Engineering Capability Maturity Model (SSE-CMM / ISO 21827) and its use in system software development... software development life cycle (SDLC)? 6. In what ways should process improvement support security in the SDLC? 1.2 Panel Resources: For each... project management, and support practices through the use of capability maturity models, including the CMMI and the Systems Security...
Managing the Software Development Process
NASA Technical Reports Server (NTRS)
Lubelczky, Jeffrey T.; Parra, Amy
1999-01-01
The goal of any software development project is to produce a product that is delivered on time, within the allocated budget, and with the capabilities expected by the customer; unfortunately, this goal is rarely achieved. However, a properly managed project in a mature software engineering environment can consistently achieve it. In this paper we introduce three project success factors: a properly managed project, a competent project manager, and a mature software engineering environment. We also present an overview of the benefits of a mature software engineering environment, based on 24 years of data from the Software Engineering Lab, and suggest some first steps an organization can take to begin benefiting from this environment. Because the depth and breadth of software engineering exceed the scope of this paper, various references are cited with the goal of raising awareness and encouraging further investigation into software engineering and project management practices.
Analyzing a Mature Software Inspection Process Using Statistical Process Control (SPC)
NASA Technical Reports Server (NTRS)
Barnard, Julie; Carleton, Anita; Stamper, Darrell E. (Technical Monitor)
1999-01-01
This paper presents a cooperative effort in which the Software Engineering Institute and the Space Shuttle Onboard Software Project experimented with applying Statistical Process Control (SPC) analysis to inspection activities. The topics include: 1) SPC Collaboration Overview; 2) SPC Collaboration Approach and Results; and 3) Lessons Learned.
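The abstract does not include the project's data, but the core SPC technique for inspection data can be sketched. Below is a minimal, hypothetical example of a c-chart (a standard control chart for defect counts per inspection, assuming Poisson-distributed counts); the data, limits, and function names are illustrative only and are not taken from the Shuttle project.

```python
# Illustrative c-chart: applying SPC to per-inspection defect counts.
# Data and results are hypothetical, not from the Shuttle project.
import math

def c_chart_limits(defect_counts):
    """Compute center line and 3-sigma control limits for a c-chart."""
    c_bar = sum(defect_counts) / len(defect_counts)
    sigma = math.sqrt(c_bar)  # Poisson assumption: variance equals mean
    ucl = c_bar + 3 * sigma
    lcl = max(0.0, c_bar - 3 * sigma)  # defect counts cannot be negative
    return c_bar, lcl, ucl

def out_of_control(defect_counts):
    """Return indices of inspections whose count falls outside the limits."""
    _, lcl, ucl = c_chart_limits(defect_counts)
    return [i for i, c in enumerate(defect_counts) if c < lcl or c > ucl]

counts = [4, 6, 5, 3, 7, 5, 4, 18, 6, 5]  # inspection 7 is an outlier
print(out_of_control(counts))  # [7]
```

A point outside the limits signals an inspection whose defect count is unlikely to come from the stable process, prompting causal analysis rather than blanket rework.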
A Prototype for the Support of Integrated Software Process Development and Improvement
NASA Astrophysics Data System (ADS)
Porrawatpreyakorn, Nalinpat; Quirchmayr, Gerald; Chutimaskul, Wichian
An efficient software development process is one of the key success factors for quality software. Both the appropriate establishment and the continuous improvement of integrated project management and of the software development process contribute to this efficiency. This paper therefore proposes a software process maintenance framework consisting of two core components: an integrated PMBOK-Scrum model describing how to establish a comprehensive set of project management and software engineering processes, and a software development maturity model advocating software process improvement. In addition, a prototype tool to support the framework is introduced.
Software Reviews Since Acquisition Reform - The Artifact Perspective
2004-01-01
Risk Management, old vs. new (Acquisition of Software Intensive Systems, 2004, Peter Hantos). Single, basic software paradigm; single processor; low... software risk mitigation related trade-offs must be done together... integral software engineering activities; process maturity and quality frameworks; quality...
NASA Astrophysics Data System (ADS)
Gunawan, D.; Amalia, A.; Rahmat, R. F.; Muchtar, M. A.; Siregar, I.
2018-02-01
Identification of software maturity level is a technique to determine the quality of software. By identifying the software maturity level, the weaknesses of the software can be observed, and the resulting recommendations can serve as a reference for future software maintenance and development. This paper discusses software Capability Level (CL) with case studies on the Quality Management Unit (Unit Manajemen Mutu) of the University of Sumatera Utara (UMM-USU). This research utilized the Standard CMMI Appraisal Method for Process Improvement class C (SCAMPI C) model with continuous representation. This model focuses on activities for developing quality products and services. The observation covers three process areas: Project Planning (PP), Project Monitoring and Control (PMC), and Requirements Management (REQM). According to the measurement of software capability level for UMM-USU software, it turns out that the capability level for the observed process areas is in the range of CL1 and CL2. Project Planning (PP) is the only process area that reaches capability level 2; PMC and REQM are still at CL1, the performed level. This research reveals several weaknesses of the existing UMM-USU software, and therefore proposes several recommendations for UMM-USU to improve the capability level of the observed process areas.
Assessing and Managing Quality of Information Assurance
2010-11-01
such as firewalls, antivirus scanning tools and mechanisms for user authentication and authorization. Advanced mission-critical systems often...imply increased risk to DoD information systems. The Process and Organizational Maturity (POM) class focuses on the maturity of the software and...include architectural quality. Common Weakness Enumeration (CWE) is a recent example that highlights the connection between software quality and
DOE Office of Scientific and Technical Information (OSTI.GOV)
Willenbring, James M.; Bartlett, Roscoe Ainsworth; Heroux, Michael Allen
2012-01-01
Software lifecycles are becoming an increasingly important issue for computational science and engineering (CSE) software. The process by which a piece of CSE software begins life as a set of research requirements and then matures into a trusted high-quality capability is both commonplace and extremely challenging. Although an implicit lifecycle is obviously being used in any effort, the challenges of this process - respecting the competing needs of research vs. production - cannot be overstated. Here we describe a proposal for a well-defined software lifecycle process based on modern Lean/Agile software engineering principles. What we propose is appropriate for many CSE software projects that are initially heavily focused on research but also are expected to eventually produce usable high-quality capabilities. The model is related to TriBITS, a build, integration and testing system, which serves as a strong foundation for this lifecycle model, and aspects of this lifecycle model are ingrained in the TriBITS system. Here, we advocate three to four phases or maturity levels that address the appropriate handling of many issues associated with the transition from research to production software. The goals of this lifecycle model are to better communicate maturity levels with customers and to help to identify and promote Software Engineering (SE) practices that will help to improve productivity and produce better software. An important collection of software in this domain is Trilinos, which is used as the motivation and the initial target for this lifecycle model. However, many other related and similar CSE (and non-CSE) software projects can also make good use of this lifecycle model, especially those that use the TriBITS system. Indeed this lifecycle process, if followed, will enable large-scale sustainable integration of many complex CSE software efforts across several institutions.
Air Traffic Control: Immature Software Acquisition Processes Increase FAA System Acquisition Risks
DOT National Transportation Integrated Search
1997-03-01
The General Accounting Office (GAO), at the request of Congress, reviewed (1) the maturity of the Federal Aviation Administration's (FAA's) Air Traffic Control (ATC) modernization software acquisition processes, and (2) the steps/actions FAA has unde...
Process Tailoring and the Software Capability Maturity Model(sm).
1995-11-01
A Discipline for Software Engineering, Addison-Wesley, 1995; Humphrey. This book summarizes the costs and benefits of a Personal Software Process (PSP)... 1994. [Humphrey95] Humphrey, Watts S. A Discipline for Software Engineering. Reading, MA: Addison-Wesley Publishing Company, 1995. (CMU/SEI-94-TR-24)... practiced and institutionalized. [Figure: process-tailoring flow; recoverable labels: Revise & Analyze, Organizational Lessons, Define Processes]
The Legacy of Space Shuttle Flight Software
NASA Technical Reports Server (NTRS)
Hickey, Christopher J.; Loveall, James B.; Orr, James K.; Klausman, Andrew L.
2011-01-01
The initial goals of the Space Shuttle Program required that the avionics and software systems blaze new trails in advancing avionics system technology. Many of the requirements placed on avionics and software were accomplished for the first time on this program. Examples include comprehensive digital fly-by-wire technology, use of a digital databus for flight critical functions, fail operational/fail safe requirements, complex automated redundancy management, and the use of a high-order software language for flight software development. In order to meet the operational and safety goals of the program, the Space Shuttle software had to be extremely high quality, reliable, robust, reconfigurable and maintainable. To achieve this, the software development team evolved a software process focused on continuous process improvement and defect elimination that consistently produced highly predictable and top quality results, providing software managers the confidence needed to sign each Certificate of Flight Readiness (COFR). This process, which has been appraised at Capability Maturity Model (CMM)/Capability Maturity Model Integration (CMMI) Level 5, has resulted in one of the lowest software defect rates in the industry. This paper will present an overview of the evolution of the Primary Avionics Software System (PASS) project and processes over thirty years, an argument for strong statistical control of software processes with examples, an overview of the success story for identifying and driving out errors before flight, a case study of the few significant software issues and how they were either identified before flight or slipped through the process onto a flight vehicle, and identification of the valuable lessons learned over the life of the project.
Process maturity progress at Motorola Cellular Systems Division
NASA Technical Reports Server (NTRS)
Borgstahl, Ron; Criscione, Mark; Dobson, Kim; Willey, Allan
1994-01-01
We believe that the key success elements are related to our recognition that Software Process Improvement (SPI) can and should be organized, planned, managed, and measured as if it were a project to develop a new process, analogous to a software product. We believe that our process improvements have come as the result of these key elements: use of a rigorous, detailed requirements set (Capability Maturity Model, CMM); use of a robust, yet flexible architecture (IEEE 1074); use of a SPI project, resourced and managed like other work, to produce the specifications and implement them; and development of both internal and external goals, with metrics to support them.
People Capability Maturity Model (P-CMM)
1995-09-01
People Capability Maturity Model (P-CMM). Bill Curtis, William E. Hefley, Sally Miller... People CMM: The P-CMM adapts the architecture and the maturity framework underlying the CMM for use with people-related improvement issues. The CMM focuses on helping organizations improve their software development processes. By adapting the maturity framework and the CMM architecture...
A Brief Study of Software Engineering Professional Continuing Education in DoD Acquisition
2010-04-01
Survey responses (respondent counts in parentheses):
- Software Lifecycle Processes (IEEE 12207) (810): 37% / 61% / 2%
- Guide to the Software Engineering Body of Knowledge (SWEBOK) (804): 67% / 31% / 2%
- Software Engineering-Software Measurement Process (ISO/IEC 15939) (797): 55% / 44% / 2%
- Capability Maturity Model Integration (806): 17% / 81% / 2%
- Six Sigma Process Improvement (804): 7% / 91% / 1%
- ISO 9000 Quality Management Systems (803): 10% / 89% / 1%
Conclusions: significant problem areas; Requirements Management; Very...
Seven Processes that Enable NASA Software Engineering Technologies
NASA Technical Reports Server (NTRS)
Housch, Helen; Godfrey, Sally
2011-01-01
This slide presentation reviews seven processes that NASA uses to ensure that software is developed, acquired, and maintained as specified in the NPR 7150.2A requirement. The requirement is to ensure that all software be appraised against the Capability Maturity Model Integration (CMMI). The enumerated processes are: (7) Product Integration, (6) Configuration Management, (5) Verification, (4) Software Assurance, (3) Measurement and Analysis, (2) Requirements Management, and (1) Planning & Monitoring. Each process is described, along with the group(s) responsible for it.
Estimating Software Effort Hours for Major Defense Acquisition Programs
ERIC Educational Resources Information Center
Wallshein, Corinne C.
2010-01-01
Software Cost Estimation (SCE) uses labor hours or effort required to conceptualize, develop, integrate, test, field, or maintain program components. Department of Defense (DoD) SCE can use initial software data parameters to project effort hours for large, software-intensive programs for contractors reporting the top levels of process maturity,…
NASA Technical Reports Server (NTRS)
Hihn, Jairus; Lewicki, Scott; Morgan, Scott
2011-01-01
The measurement techniques for organizations which have achieved the Software Engineering Institute's CMMI Maturity Levels 4 and 5 are well documented. On the other hand, how to effectively measure when an organization is at Maturity Level 3 is less well understood, especially when there is no consistency in tool use and there is extensive tailoring of the organizational software processes. Most organizations fail in their attempts to generate, collect, and analyze standard process improvement metrics under these conditions. But at JPL, NASA's prime center for deep space robotic exploration, we have a long history of proving there is always a solution: it just may not be what you expected. In this paper we describe the wide variety of qualitative and quantitative techniques we have been implementing over the last few years, including the various approaches used to communicate the results to both software technical managers and senior managers.
Requirements Analysis for Large Ada Programs: Lessons Learned on CCPDS- R
1989-12-01
...when the design had matured and the SRS role was to be the tester's contract, implementation... This approach was not optimal from the formal testing and... on the software development process is the necessity to include sufficient testing... CPU processing load. These constraints primarily affect algorithm... allocations and timing requirements are by-products of the software design process when multiple CSCIs are executed within...
Interpreting CMMI High Maturity for Small Organizations
2008-09-01
Stoddard, September 2008. Congreso Internacional en Ingeniería de Software y sus Aplicaciones (International Congress of Software Engineering and its Applications). Why This Workshop? CMMI Process Performance...
An object-oriented description method of EPMM process
NASA Astrophysics Data System (ADS)
Jiang, Zuo; Yang, Fan
2017-06-01
To use mature object-oriented tools and languages in software process modelling, and to make software process models conform more closely to industry standards, it is necessary to study the object-oriented modelling of software processes. Based on the formal process definition in EPMM, and considering that Petri nets are primarily a formal modelling tool, this paper combines Petri net modelling with object-oriented modelling ideas and provides an implementation method for converting Petri-net-based EPMM models into object-oriented models.
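As a rough illustration of the kind of object-oriented encoding the abstract describes, the sketch below models a tiny Petri net (places, transitions, token firing) as Python objects. The class design is a generic textbook mapping and an assumption on my part; it is not the paper's actual EPMM conversion method.

```python
# A hypothetical object-oriented encoding of a simple Petri net:
# places hold tokens; a transition fires when all input places are marked.
# Class names are illustrative, not the actual EPMM mapping.

class Place:
    def __init__(self, name, tokens=0):
        self.name = name
        self.tokens = tokens

class Transition:
    def __init__(self, name, inputs, outputs):
        self.name = name
        self.inputs = inputs    # list of Place consumed from
        self.outputs = outputs  # list of Place produced into

    def enabled(self):
        return all(p.tokens > 0 for p in self.inputs)

    def fire(self):
        if not self.enabled():
            raise RuntimeError(f"{self.name} is not enabled")
        for p in self.inputs:
            p.tokens -= 1
        for p in self.outputs:
            p.tokens += 1

# Model a two-step process fragment: design -> review -> done.
designed = Place("designed", tokens=1)
reviewed = Place("reviewed")
done = Place("done")
review = Transition("review", [designed], [reviewed])
approve = Transition("approve", [reviewed], [done])

review.fire()
approve.fire()
print(done.tokens)  # 1
```

Encapsulating marking and firing rules in classes is what lets ordinary object-oriented tooling (inheritance, serialization, testing frameworks) be applied to a formally defined process model.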
Process improvement as an investment: Measuring its worth
NASA Technical Reports Server (NTRS)
Mcgarry, Frank; Jeletic, Kellyann
1993-01-01
This paper discusses return on investment (ROI) generated from software process improvement programs. It details the steps needed to compute ROI and compares these steps from the perspective of two process improvement approaches: the widely known Software Engineering Institute's capability maturity model and the approach employed by NASA's Software Engineering Laboratory (SEL). The paper then describes the specific investments made in the SEL over the past 18 years and discusses the improvements gained from this investment by the production organization in the SEL.
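The paper's specific ROI steps are not reproduced in this summary, but the basic arithmetic of return on investment can be illustrated. The figures below are hypothetical, not SEL data.

```python
# Generic ROI calculation for a process improvement program.
# The dollar figures are hypothetical; the SEL's actual method and data differ.

def roi(benefits, investment):
    """Return on investment as a fraction: (benefits - investment) / investment."""
    return (benefits - investment) / investment

# Suppose an improvement program cost $200k and avoided $340k of rework.
investment = 200_000
benefits = 340_000
print(f"ROI = {roi(benefits, investment):.0%}")  # ROI = 70%
```

The hard part in practice is not this division but attributing measured benefits (defect reduction, rework avoided) to the improvement program, which is what the paper's step-by-step treatment addresses.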
Leveraging People-Related Maturity Issues for Achieving Higher Maturity and Capability Levels
NASA Astrophysics Data System (ADS)
Buglione, Luigi
Over the past 20 years, Maturity Models (MM) have become a buzzword in the ICT world. Since Crosby's initial idea in 1979, plenty of models have been created in the software and systems engineering domains, addressing various perspectives. Analyzing the content of the Process Reference Models (PRM) in many of them shows that people-related issues have little weight in appraisals of organizational capability, while in practice they are considered significant contributors in traditional process and organizational performance appraisals, as stressed in well-known performance management models such as MBNQA, EFQM, and BSC. This paper proposes some ways to leverage people-related maturity issues by merging HR practices from several types of maturity models into the organizational Business Process Model (BPM) in order to achieve higher organizational maturity and capability levels.
Contract Management Process Maturity: Analysis of Recent Organizational Assessments
2009-04-22
Airman: The Book (Special Issue, Vol. LI). Washington, DC: Air Force News Agency, Secretary of the Air Force Office of Public Affairs. Yueng, A.K... competence (Yueng, Ulrich, Nason, & von Glinow, 1999) • Capability maturity models have been successfully used in assessing software management and...
A Fast Technology Infusion Model for Aerospace Organizations
NASA Technical Reports Server (NTRS)
Shapiro, Andrew A.; Schone, Harald; Brinza, David E.; Garrett, Henry B.; Feather, Martin S.
2006-01-01
A multi-year Fast Technology Infusion initiative proposes a model for aerospace organizations to improve the cost-effectiveness with which they mature new, in-house developed software and hardware technologies for space mission use. The first-year task under the umbrella of this initiative will provide the framework to demonstrate and document the fast infusion process. The viability of this approach will be demonstrated on two technologies developed in prior years with internal Jet Propulsion Laboratory (JPL) funding. One hardware technology and one software technology were selected for maturation within one calendar year or less. The overall objective is to achieve cost and time savings in the qualification of technologies. At the end of the recommended three-year effort, we will have demonstrated for six or more in-house developed technologies a clear path to insertion using a documented process that permits adaptation to a broad range of hardware and software projects.
CMMI Level 5 and the Team Software Process
2007-04-01
...could meet the rigors of a CMMI assessment and achieve their group's goal of Level 5. Watts Humphrey, who is widely acknowledged as the founder of the... Capability Maturity Model® (CMM®) approach to improvement and who later created the Personal Software Process (PSP) and TSP, has noted that one of the... intents of PSP and TSP is to be an operational process enactment of CMM Level 5 processes at the personal and project levels respectively [1]. CMM...
A Measurement & Analysis Training Solution Supporting CMMI & Six Sigma Transition
2004-10-01
..."product" • Designing an integrated training solution • Illustration(s) (© 2004 by Carnegie Mellon University)... 12207, scorecard, EIA 632, ISO 9000, ITIL, COBIT, PSM, GQIM... several Capability Maturity Models, reflects Crosby's 5 maturity levels • Focuses on infrastructure and process maturity • Intended for software and...
Costs and Benefits of Software Process Improvement
1997-12-01
Master's thesis, December 1997. ...in this field, an organization's chance for success depends first on having an exceptional manager and an effective development team (PEOPLE)... Secondly, it depends on its effective use of TECHNOLOGY, and finally, on its PROCESS maturity. [Ref. 4] In a software organization, PEOPLE refers to...
A Matrix Approach to Software Process Definition
NASA Technical Reports Server (NTRS)
Schultz, David; Bachman, Judith; Landis, Linda; Stark, Mike; Godfrey, Sally; Morisio, Maurizio; Powers, Edward I. (Technical Monitor)
2000-01-01
The Software Engineering Laboratory (SEL) is currently engaged in a Methodology and Metrics program for the Information Systems Center (ISC) at Goddard Space Flight Center (GSFC). This paper addresses the Methodology portion of the program. The purpose of the Methodology effort is to assist a software team lead in selecting and tailoring a software development or maintenance process for a specific GSFC project. It is intended that this process will also be compliant with both ISO 9001 and the Software Engineering Institute's Capability Maturity Model (CMM). Under the Methodology program, we have defined four standard ISO-compliant software processes for the ISC, and three tailoring criteria that team leads can use to categorize their projects. The team lead would select a process and appropriate tailoring factors, from which a software process tailored to the specific project could be generated. Our objective in the Methodology program is to present software process information in a structured fashion, to make it easy for a team lead to characterize the type of software engineering to be performed, and to apply tailoring parameters to search for an appropriate software process description. This will enable the team lead to follow a proven, effective software process and also satisfy NASA's requirement for compliance with ISO 9001 and the anticipated requirement for CMM assessment. This work is also intended to support the deployment of sound software processes across the ISC.
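As a sketch of the matrix idea described above, the snippet below maps project characteristics to a standard process description via a lookup table. The criteria and process names are invented for illustration; the ISC's actual four processes and three tailoring criteria are not given in this summary.

```python
# Hypothetical tailoring matrix: map project characteristics to one of
# several standard process descriptions. The keys and process names are
# illustrative, not the ISC's actual processes or tailoring criteria.

PROCESS_MATRIX = {
    # (lifecycle phase, criticality) -> standard process description
    ("development", "high"): "Full ISO/CMM-compliant development process",
    ("development", "low"): "Lightweight development process",
    ("maintenance", "high"): "Controlled maintenance process",
    ("maintenance", "low"): "Routine maintenance process",
}

def select_process(lifecycle, criticality):
    """Look up the tailored process for a project's characteristics."""
    try:
        return PROCESS_MATRIX[(lifecycle, criticality)]
    except KeyError:
        raise ValueError(f"No process defined for {(lifecycle, criticality)}")

print(select_process("development", "high"))
```

Structuring the selection as a matrix makes the tailoring decision explicit and auditable, which supports the ISO 9001 and CMM compliance goals the abstract mentions.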
An assessment of space shuttle flight software development processes
NASA Technical Reports Server (NTRS)
1993-01-01
In early 1991, the National Aeronautics and Space Administration's (NASA's) Office of Space Flight commissioned the Aeronautics and Space Engineering Board (ASEB) of the National Research Council (NRC) to investigate the adequacy of the current process by which NASA develops and verifies changes and updates to the Space Shuttle flight software. The Committee for Review of Oversight Mechanisms for Space Shuttle Flight Software Processes was convened in Jan. 1992 to accomplish the following tasks: (1) review the entire flight software development process from the initial requirements definition phase to final implementation, including object code build and final machine loading; (2) review and critique NASA's independent verification and validation process and mechanisms, including NASA's established software development and testing standards; (3) determine the acceptability and adequacy of the complete flight software development process, including the embedded validation and verification processes through comparison with (1) generally accepted industry practices, and (2) generally accepted Department of Defense and/or other government practices (comparing NASA's program with organizations and projects having similar volumes of software development, software maturity, complexity, criticality, lines of code, and national standards); (4) consider whether independent verification and validation should continue. An overview of the study, independent verification and validation of critical software, and the Space Shuttle flight software development process are addressed. Findings and recommendations are presented.
NASA Astrophysics Data System (ADS)
McCray, Wilmon Wil L., Jr.
The research was prompted by a need to assess the process improvement, quality management, and analytical techniques taught to students in U.S. undergraduate and graduate systems engineering and computing science degree programs (e.g., software engineering, computer science, and information technology) that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, and process improvement methods, and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques and process performance modeling to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The study identifies and discusses in detail the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis identifying the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models.
The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrates the use of statistical methods, statistical process control, sensitivity analysis, quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
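A Monte Carlo prediction of a satisfaction index, as described above, might be sketched as follows. The drivers, weights, and distributions below are hypothetical stand-ins, not the ACSI model's actual structure or the study's simulation.

```python
# Monte Carlo sketch: predict a customer-satisfaction index from
# uncertain driver scores. Weights and distributions are hypothetical,
# not the ACSI model's actual structure.
import random
import statistics

random.seed(42)  # fixed seed for reproducibility

def simulate_index(n_trials=10_000):
    """Sample driver scores and return mean and stdev of the blended index."""
    scores = []
    for _ in range(n_trials):
        quality = random.gauss(80, 5)       # perceived quality driver
        expectations = random.gauss(75, 6)  # customer expectations driver
        value = random.gauss(70, 8)         # perceived value driver
        index = 0.5 * quality + 0.3 * expectations + 0.2 * value
        scores.append(index)
    return statistics.mean(scores), statistics.stdev(scores)

mean, stdev = simulate_index()
print(f"predicted index: {mean:.1f} +/- {stdev:.1f}")
```

Running many sampled scenarios instead of a single point estimate yields a distribution of outcomes, which is what lets a dashboard report a baseline and a confidence band for future index scores.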
Programming Makes Software; Support Makes Users
NASA Astrophysics Data System (ADS)
Batcheller, A. L.
2010-12-01
Skilled software engineers may build fantastic software for climate modeling, yet fail to achieve their project's objectives. Software support and related activities are just as critical as writing software. This study followed three different software projects in the climate sciences, using interviews, observation, and document analysis to examine the value added by support work. Supporting the project and interacting with users was a key task for software developers, who often spent 50% of their time on it. Such support work most often involved replying to questions on an email list, but also included talking to users on teleconference calls and in person. Software support increased adoption by building the software's reputation and showing individuals how the software could meet their needs. In the process of providing support, developers often learned of new requirements as users reported features they desired and bugs they found. As software matures and gains widespread use, support work often increases. In fact, such increases can be one signal that the software has achieved broad acceptance. Maturing projects also find demand for instructional classes, online tutorials, and detailed examples of how to use the software. The importance of support highlights the fact that building software systems involves both social and technical aspects. Yes, we need to build the software, but we also need to "build" the users and practices that can take advantage of it.
2016-12-01
3. Design and Production Maturity... issues (PEO Ships, 2016). Despite the fact that the Navy accepted the first ship, USS Zumwalt (DDG 1000)... Modern AM is a process that uses Computer-Aided Design (CAD) software to create three-dimensional (3D) products by adding one layer on top...
Software Process Assurance for Complex Electronics
NASA Technical Reports Server (NTRS)
Plastow, Richard A.
2007-01-01
Complex Electronics (CE) now perform tasks that were previously handled in software, such as communication protocols. Many methods used to develop software bear a close resemblance to CE development. Field Programmable Gate Arrays (FPGAs) can have over a million logic gates, while system-on-chip (SOC) devices can combine a microprocessor, input and output channels, and sometimes an FPGA for programmability. With this increased intricacy, the possibility of software-like bugs, such as incorrect design, faulty logic, and unexpected interactions within the logic, is great. With CE devices obscuring the hardware/software boundary, we propose that mature software methodologies may be utilized, with slight modifications, in the development of these devices. Software Process Assurance for Complex Electronics (SPACE) is a research project that used standardized software assurance and engineering practices to provide an assurance framework for development activities. Tools such as checklists, best practices, and techniques were used to detect missing requirements and bugs earlier in the development cycle, creating a development process for CE that was more easily maintained, consistent, and configurable based on the device used.
Neural classifier in the estimation process of maturity of selected varieties of apples
NASA Astrophysics Data System (ADS)
Boniecki, P.; Piekarska-Boniecka, H.; Koszela, K.; Zaborowicz, M.; Przybył, K.; Wojcieszak, D.; Zbytek, Z.; Ludwiczak, A.; Przybylak, A.; Lewicki, A.
2015-07-01
This paper presents methods of neural image analysis aimed at estimating the maturity state of selected varieties of apples which are popular in Poland. Identification of the degree of maturity of selected apple varieties was conducted on the basis of information encoded in graphical form in digital photos. The process applies the BBCH scale, which is used to determine the maturity of apples; the scale is widely used in the EU and has been developed for many species of monocotyledonous and dicotyledonous plants. The scale also enables detailed determination of the development stage of a given plant. The purpose of this work is to identify the maturity level of selected varieties of apples, supported by image analysis methods and classification techniques based on artificial neural networks. Analysis of representative graphical features extracted by image analysis methods enabled the assessment of apple maturity. For practical use, the "JabVis 1.1" neural IT system was created in accordance with software engineering requirements, dedicated to supporting decision-making processes in the broadly understood production and processing of apples.
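The final classification step of such a system can be illustrated with a toy nearest-centroid classifier. The colour features and class centroids below are invented for illustration and are unrelated to the actual "JabVis 1.1" system; a real pipeline would first extract such features from the apple photographs.

```python
# Invented per-class mean feature vectors: (red fraction, green fraction).
CENTROIDS = {
    "unripe":   (0.20, 0.70),
    "ripe":     (0.65, 0.30),
    "overripe": (0.80, 0.15),
}

def classify(features):
    """Assign an image's feature vector to the nearest class centroid."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(CENTROIDS, key=lambda k: dist2(features, CENTROIDS[k]))
```

In practice a trained neural network replaces the fixed centroids, but the input/output contract is the same: a feature vector in, a maturity class out.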
A study of software standards used in the avionics industry
NASA Technical Reports Server (NTRS)
Hayhurst, Kelly J.
1994-01-01
Within the past decade, software has become an increasingly common element in computing systems. In particular, the role of software used in the aerospace industry, especially in life- or safety-critical applications, is rapidly expanding. This intensifies the need to use effective techniques for achieving and verifying the reliability of avionics software. Although certain software development processes and techniques are mandated by government regulating agencies, no one methodology has been shown to consistently produce reliable software. The knowledge base for designing reliable software simply has not reached the maturity of its hardware counterpart. In an effort to increase our understanding of software, the Langley Research Center conducted a series of experiments over 15 years with the goal of understanding why and how software fails. As part of this program, the effectiveness of current industry standards for the development of avionics is being investigated. This study involves the generation of a controlled environment to conduct scientific experiments on software processes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bartlett, Roscoe A; Heroux, Dr. Michael A; Willenbring, James
2012-01-01
Software lifecycles are becoming an increasingly important issue for computational science & engineering (CSE) software. The process by which a piece of CSE software begins life as a set of research requirements and then matures into a trusted high-quality capability is both commonplace and extremely challenging. Although an implicit lifecycle is obviously being used in any effort, the challenges of this process--respecting the competing needs of research vs. production--cannot be overstated. Here we describe a proposal for a well-defined software lifecycle process based on modern Lean/Agile software engineering principles. What we propose is appropriate for many CSE software projects that are initially heavily focused on research but also are expected to eventually produce usable high-quality capabilities. The model is related to TriBITS, a build, integration and testing system, which serves as a strong foundation for this lifecycle model, and aspects of this lifecycle model are ingrained in the TriBITS system. Indeed this lifecycle process, if followed, will enable large-scale sustainable integration of many complex CSE software efforts across several institutions.
From Exotic to Mainstream: A 10-year Odyssey from Internet Speed to Boundary Spanning with Scrum
NASA Astrophysics Data System (ADS)
Baskerville, Richard; Pries-Heje, Jan; Madsen, Sabine
Based on four empirical studies conducted over a 10-year time period from 1999 to 2008, we investigate how local software processes interact with global changes in the software development context. In 1999 companies were developing software at high speed in a desperate rush to be first-to-market. In 2001 a new high speed/quick results development process had become established practice. In 2003 changes in the market created the need for a more balanced view on speed and quality, and in 2008 companies were successfully combining agile and plan-driven approaches to achieve the benefits of both. The studies reveal a two-stage pattern in which dramatic changes in the market cause disruption of established practices, experimentation, and process adaptations, followed by consolidation of lessons learnt into a new (and once again mature) software development process. Limitations, implications, and areas for future research are discussed.
Quantitative CMMI Assessment for Offshoring through the Analysis of Project Management Repositories
NASA Astrophysics Data System (ADS)
Sunetnanta, Thanwadee; Nobprapai, Ni-On; Gotel, Olly
The nature of distributed teams and the existence of multiple sites in offshore software development projects pose a challenging setting for software process improvement. Often, the improvement and appraisal of software processes is achieved through a turnkey solution where best practices are imposed or transferred from a company’s headquarters to its offshore units. In such a setting, successful project health checks and quality monitoring of software processes require strong project management skills and well-built onshore-offshore coordination, and often need regular onsite visits by software process improvement consultants from the headquarters’ team. This paper focuses on software process improvement as guided by the Capability Maturity Model Integration (CMMI) and proposes a model to evaluate the status of such improvement efforts in the context of distributed multi-site projects without some of this overhead. The paper discusses the application of quantitative CMMI assessment through the collection and analysis of project data gathered directly from project repositories to facilitate CMMI implementation and reduce the cost of such implementation for offshore-outsourced software development projects. We exemplify this approach to quantitative CMMI assessment through the analysis of project management data and discuss the future directions of this work in progress.
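The core idea of assessing process compliance directly from repository data can be sketched as a simple coverage metric. This is not the paper's actual model; the record schema and the field name "peer_review_done" are hypothetical.

```python
def practice_coverage(tasks, evidence_field):
    """Fraction of closed tasks that carry repository evidence for one
    CMMI practice (e.g. a recorded peer review). Returns 0.0 if no
    tasks have been closed yet."""
    done = [t for t in tasks if t["status"] == "closed"]
    if not done:
        return 0.0
    return sum(1 for t in done if t.get(evidence_field)) / len(done)

# Hypothetical project-repository records for an offshore site.
tasks = [
    {"id": 1, "status": "closed", "peer_review_done": True},
    {"id": 2, "status": "closed", "peer_review_done": False},
    {"id": 3, "status": "open",   "peer_review_done": False},
    {"id": 4, "status": "closed", "peer_review_done": True},
]
```

Scores like this, computed per practice and per site from the repositories themselves, are what would replace onsite consultant visits in a quantitative assessment.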
The Birth, Death, and Resurrection of an SPI Project
NASA Astrophysics Data System (ADS)
Carlsson, Sven; Schönström, Mikael
Commentators on contemporary themes of strategic management and firm competitiveness stress that a firm's competitive advantage flows from its unique knowledge and how it manages knowledge, and for many firms their ability to create, share, exchange, and use knowledge has a major impact on their competitiveness (Nonaka & Teece 2001). In software development, knowledge management (KM) plays an increasingly important role. It has been argued that the KM field is an important source for creating new perspectives on the software development process (Iivari 2000). Several Software Process Improvement (SPI) approaches stress the importance of managing knowledge and experiences as a way of improving software processes (Ahern et al. 2001). Another SPI trend is the use of ideas from process management, as in the Capability Maturity Model (CMM). Unfortunately, little research exists on the effects of the use of process management ideas in SPI. Given the influx of process management ideas into SPI, the impact of these ideas should be addressed.
Software Process Assurance for Complex Electronics (SPACE)
NASA Technical Reports Server (NTRS)
Plastow, Richard A.
2007-01-01
Complex Electronics (CE) are now programmed to perform tasks that were previously handled in software, such as communication protocols. Many of the methods used to develop software bear a close resemblance to CE development. For instance, Field Programmable Gate Arrays (FPGAs) can have over a million logic gates, while system-on-chip (SOC) devices can combine a microprocessor, input and output channels, and sometimes an FPGA for programmability. With this increased intricacy, the possibility of software-like bugs such as incorrect design, flawed logic, and unexpected interactions within the logic is significant. Since CE devices are obscuring the hardware/software boundary, we propose that mature software methodologies may be utilized with slight modifications in the development of these devices. Software Process Assurance for Complex Electronics (SPACE) is a research project that looks at using standardized software assurance and engineering practices to provide an assurance framework for development activities. Tools such as checklists, best practices, and techniques can be used to detect missing requirements and bugs earlier in the development cycle, creating a development process for CE that will be more easily maintained, consistent, and configurable based on the device used.
Predicting Software Suitability Using a Bayesian Belief Network
NASA Technical Reports Server (NTRS)
Beaver, Justin M.; Schiavone, Guy A.; Berrios, Joseph S.
2005-01-01
The ability to reliably predict the end quality of software under development presents a significant advantage for a development team. It provides an opportunity to address high risk components earlier in the development life cycle, when their impact is minimized. This research proposes a model that captures the evolution of the quality of a software product, and provides reliable forecasts of the end quality of the software being developed in terms of product suitability. Development team skill, software process maturity, and software problem complexity are hypothesized as driving factors of software product quality. The cause-effect relationships between these factors and the elements of software suitability are modeled using Bayesian Belief Networks, a machine learning method. This research presents a Bayesian Network for software quality, and the techniques used to quantify the factors that influence and represent software quality. The developed model is found to be effective in predicting the end product quality of small-scale software development efforts.
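The hypothesized cause-effect structure, with team skill, process maturity, and problem complexity as parents of product quality, can be evaluated by direct enumeration on a small discrete network. This is an illustrative sketch only; the paper does not publish its network parameters, and every probability below is invented.

```python
# Priors over the three driving factors (True = "high"). Invented values.
P_skill      = {True: 0.6, False: 0.4}   # development team skill
P_complexity = {True: 0.4, False: 0.6}   # software problem complexity

# Invented CPT: P(quality = high | skill, maturity, complexity).
CPT = {
    (True,  True,  False): 0.95, (True,  True,  True):  0.80,
    (True,  False, False): 0.70, (True,  False, True):  0.50,
    (False, True,  False): 0.60, (False, True,  True):  0.40,
    (False, False, False): 0.35, (False, False, True):  0.15,
}

def p_quality_high(maturity):
    """P(quality = high | process maturity), summing out the other parents."""
    return sum(P_skill[s] * P_complexity[c] * CPT[(s, maturity, c)]
               for s in (True, False) for c in (True, False))
```

With these invented numbers, conditioning on high process maturity raises the predicted probability of a suitable product, which is the qualitative behaviour the model is built to capture; a real BBN tool would learn such tables from project data.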
2009-03-01
Fragmented excerpts from the report: Watts S. Humphrey, "Characterizing the Software Process: A Maturity Framework" (CMU/SEI); the five-stage organizational maturity scale was developed by the SEI and first described in an SEI technical report authored by Watts Humphrey in 1987 [Humphrey 1987]. In 2007, an additional model was released (CMMI for Acquisition, V1.2), but this technical report focuses only on CMMI-DEV, V1.2.
Maximizing your Process Improvement ROI through Harmonization
2008-03-01
Fragmented excerpts from the report: standards (such as ISO 12207) provide comprehensive guidance on what system and software engineering processes are needed, while the frameworks of Six Sigma provide specific guidance; their veloci-Q enterprise integrated system includes ISO 9001, CMM, P-CMM, TL9000, British Standard 7799, and Six Sigma, with an estimated 30... reduction; they chose to blend process maturity models and ISO standards at their discretion to support their objective regarding the establishment of...
NASA Technical Reports Server (NTRS)
Basili, Victor R.
1992-01-01
The concepts of quality improvement have permeated many businesses. It is clear that the nineties will be the quality era for software, and there is a growing need to develop or adapt quality improvement approaches to the software business. Thus we must understand software as an artifact and software as a business. Since the business we are dealing with is software, we must understand the nature of software and software development. The software discipline is evolutionary and experimental; it is a laboratory science. Software is development, not production. The technologies of the discipline are human based. There is a lack of models that allow us to reason about the process and the product. All software is not the same: process is a variable, goals are variable, etc. Packaged, reusable experiences require additional resources in the form of organization, processes, people, etc. A variety of organizational frameworks have been proposed to improve quality for various businesses. The ones discussed in this presentation include: Plan-Do-Check-Act, a quality improvement process based upon a feedback cycle for optimizing a single process model/production line; the Experience Factory/Quality Improvement Paradigm, continuous improvement through the experimentation, packaging, and reuse of experiences based upon a business's needs; Total Quality Management, a management approach to long-term success through customer satisfaction, based on the participation of all members of an organization; the SEI Capability Maturity Model, a staged process improvement based upon assessment with regard to a set of key process areas until level 5 is reached, which represents continuous process improvement; and Lean (software) Development, a principle supporting the concentration of production on 'value added' activities and the elimination or reduction of 'non-value-added' activities.
Key Practices of the Capability Maturity Model, Version 1.1
1993-02-01
Table-of-contents excerpts from the report: 4 Interpreting the CMM; 4.1 Interpreting the Key Practices; 4.2 Interpreting the Common Features; 4.2.5 Verifying Implementation; 4.3 Interpreting Software Process Definition.
Agile: From Software to Mission Systems
NASA Technical Reports Server (NTRS)
Trimble, Jay; Shirley, Mark; Hobart, Sarah
2017-01-01
To maximize efficiency and flexibility in Mission Operations System (MOS) design, we are evolving principles from agile and lean methods for software, to the complete mission system. This allows for reduced operational risk at reduced cost, and achieves a more effective design through early integration of operations into mission system engineering and flight system design. The core principles are assessment of capability through demonstration, risk reduction through targeted experiments, early test and deployment, and maturation of processes and tools through use.
Huang, Jiahua; Zhou, Hai; Zhang, Binbin; Ding, Biao
2015-09-01
This article presents new Web-based failure database software for orthopaedic implants. The software uses the browser/server (B/S) mode, with ASP dynamic web technology as its main development language to achieve data interactivity and Microsoft Access as the database; these mature technologies make the software easy to extend or upgrade. The design and development ideas, the working process, and the functions of the software, as well as its relevant technical features, are presented. With this software, many different types of failure events of orthopaedic implants can be stored and the failure data statistically analyzed; at the macroscopic level, it can be used to evaluate the reliability of orthopaedic implants and operations, and it can ultimately guide doctors in improving the clinical level of treatment.
Loh, K B; Ramli, N; Tan, L K; Roziah, M; Rahmat, K; Ariffin, H
2012-07-01
The degree and status of white matter myelination can be sensitively monitored using diffusion tensor imaging (DTI). This study looks at the measurement of fractional anisotropy (FA) and mean diffusivity (MD) using an automated ROI with an existing DTI atlas. Anatomical MRI and structural DTI were performed cross-sectionally on 26 normal children (newborn to 48 months old), using 1.5-T MRI. The automated processing pipeline was implemented to convert diffusion-weighted images into the NIfTI format. DTI-TK software was used to register the processed images to the ICBM DTI-81 atlas, while AFNI software was used for automated atlas-based volumes of interest (VOIs) and statistical value extraction. DTI exhibited consistent grey-white matter contrast. Triphasic temporal variation of the FA and MD values was noted, with FA increasing and MD decreasing rapidly early in the first 12 months. The second phase lasted 12-24 months, during which the rate of FA and MD changes was reduced. After 24 months, the FA and MD values plateaued. DTI is a superior technique to conventional MR imaging in depicting WM maturation. The use of the automated processing pipeline provides a reliable environment for quantitative analysis of high-throughput DTI data. Diffusion tensor imaging outperforms conventional MRI in depicting white matter maturation. • DTI will become an important clinical tool for diagnosing paediatric neurological diseases. • DTI appears especially helpful for developmental abnormalities, tumours and white matter disease. • An automated processing pipeline assists quantitative analysis of high throughput DTI data.
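The triphasic FA trajectory the study describes (fast rise to 12 months, slower rise to 24 months, then a plateau) can be sketched as a piecewise-linear model. The baseline and slope values below are invented placeholders, not the study's fitted parameters.

```python
def fa_model(age_months, fa0=0.25, r1=0.015, r2=0.004):
    """Piecewise-linear sketch of fractional anisotropy vs. age:
    phase 1 (0-12 mo) rises at r1/month, phase 2 (12-24 mo) at the
    slower r2/month, phase 3 (>24 mo) is flat. All values invented."""
    if age_months <= 12:
        return fa0 + r1 * age_months
    if age_months <= 24:
        return fa0 + r1 * 12 + r2 * (age_months - 12)
    return fa0 + r1 * 12 + r2 * 12
```

Fitting the two breakpoints and slopes to atlas-extracted VOI values, rather than fixing them, is what a quantitative analysis of the pipeline's output would actually do; MD would use the same shape with decreasing segments.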
Mission Analysis, Operations, and Navigation Toolkit Environment (Monte) Version 040
NASA Technical Reports Server (NTRS)
Sunseri, Richard F.; Wu, Hsi-Cheng; Evans, Scott E.; Evans, James R.; Drain, Theodore R.; Guevara, Michelle M.
2012-01-01
Monte is a software set designed for use in mission design and spacecraft navigation operations. The system can process measurement data, design optimal trajectories and maneuvers, and do orbit determination, all in one application. For the first time, a single software set can be used for mission design and navigation operations. This eliminates problems due to different models and fidelities used in legacy mission design and navigation software. The unique features of Monte 040 include a blowdown thruster model for GRAIL (Gravity Recovery and Interior Laboratory) with associated pressure models, as well as an updated optimal-search capability (COSMIC) that facilitated mission design for ARTEMIS. Existing legacy software lacked the capabilities necessary for these two missions. There is also a mean orbital element propagator and an osculating-to-mean element converter that allows long-term orbital stability analysis for the first time in compiled code. The optimized trajectory search tool COSMIC allows users to place constraints and controls on their searches without any restrictions. Constraints may be user-defined and depend on trajectory information either forward or backwards in time. In addition, a long-term orbit stability analysis tool (morbiter) existed previously as a set of scripts on top of Monte. Monte is becoming the primary tool for navigation operations, a core competency at JPL. The mission design capabilities in Monte are becoming mature enough for use in project proposals as well as post-phase A mission design. Monte has three distinct advantages over existing software. First, it is being developed in a modern paradigm: object-oriented C++ and Python. Second, the software has been developed as a toolkit, which allows users to customize their own applications and allows the development team to implement requirements quickly, efficiently, and with minimal bugs.
Finally, the software is managed in accordance with the CMMI (Capability Maturity Model Integration), where it has been appraised at maturity level 3.
Proceedings of the Twenty-Fourth Annual Software Engineering Workshop
NASA Technical Reports Server (NTRS)
2000-01-01
On December 1 and 2, the Software Engineering Laboratory (SEL), a consortium composed of NASA/Goddard, the University of Maryland, and CSC, held the 24th Software Engineering Workshop (SEW), the last of the millennium. Approximately 240 people attended the 2-day workshop. Day 1 was composed of four sessions: International Influence of the Software Engineering Laboratory; Object Oriented Testing and Reading; Software Process Improvement; and Space Software. For the first session, three internationally known software process experts discussed the influence of the SEL with respect to software engineering research. In the Space Software session, prominent representatives from three different NASA sites (GSFC's Marti Szczur, the Jet Propulsion Laboratory's Rick Doyle, and the Ames Research Center IV&V Facility's Lou Blazy) discussed the future of space software in their respective centers. At the end of the first day, the SEW sponsored a reception at the GSFC Visitors' Center. Day 2 also provided four sessions: Using the Experience Factory; a panel discussion entitled "Software Past, Present, and Future: Views from Government, Industry, and Academia"; Inspections; and COTS. The day started with an excellent talk by CSC's Frank McGarry on "Attaining Level 5 in CMM Process Maturity." Session 2, the panel discussion on software, featured NASA Chief Information Officer Lee Holcomb (Government), our own Jerry Page (Industry), and Mike Evangelist of the National Science Foundation (Academia). Each presented his perspective on the most important developments in software in the past 10 years, in the present, and in the future.
Padalino, Saverio; Sfondrini, Maria Francesca; Chenuil, Laura; Scudeller, Luigia; Gandini, Paola
2014-12-01
The aim of this study was to assess the feasibility of skeletal maturation analysis using the Cervical Vertebrae Maturation (CVM) method by means of dedicated software, developed in collaboration with Outside Format (Paullo-Milan), as compared with manual analysis. From a sample of patients aged 7-21 years, we gathered 100 lateral cephalograms, 20 for each of the five CVM stages. For each cephalogram, we traced cervical vertebrae C2, C3 and C4 both by hand, using a lead pencil and an acetate sheet, and with the dedicated software. All the tracings were made by an experienced operator (a dentofacial orthopedics resident) and by an inexperienced operator (a student in dental surgery). Each operator recorded the time needed to make each tracing in order to demonstrate differences in the times taken. Concordance between the manual analysis and the analysis performed using the dedicated software was 94% for the resident and 93% for the student. Interobserver concordance was 99%. Hand-tracing was quicker than tracing by means of the software, which took 28 seconds more on average. The cervical vertebrae analysis software offers excellent clinical performance, even if the method takes longer than the manual technique. Copyright © 2014 Elsevier Masson SAS. All rights reserved.
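The concordance figures reported above are simple percent agreement between two stagings of the same cephalograms, which can be computed as follows (the stage lists below are invented examples, not the study's data):

```python
def percent_concordance(stages_a, stages_b):
    """Percentage of cases in which two raters (or a rater and the
    software) assign the same CVM stage (1-5) to the same cephalogram."""
    assert len(stages_a) == len(stages_b) and stages_a, "paired, non-empty"
    agree = sum(a == b for a, b in zip(stages_a, stages_b))
    return 100.0 * agree / len(stages_a)
```

Note that percent agreement ignores chance agreement; with only five stages, a statistic such as Cohen's kappa is often reported alongside it.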
Studzinski, J
2017-06-01
The Digital Imaging Adoption Model (DIAM) has been jointly developed by HIMSS Analytics and the European Society of Radiology (ESR). It helps evaluate the maturity of IT-supported processes in medical imaging, particularly in radiology. This eight-stage maturity model drives your organisational, strategic and tactical alignment towards imaging-IT planning. The key audience for the model comprises hospitals with imaging centers, as well as external imaging centers that collaborate with hospitals. The assessment focuses on different dimensions relevant to digital imaging, such as software infrastructure and usage, workflow security, clinical documentation and decision support, data exchange and analytical capabilities. With its standardised approach, it enables regional, national and international benchmarking. All DIAM participants receive a structured report that can be used as a basis for presenting, e.g. budget planning and investment decisions at management level.
Agile Methods for Open Source Safety-Critical Software
Enquobahrie, Andinet; Ibanez, Luis; Cheng, Patrick; Yaniv, Ziv; Cleary, Kevin; Kokoori, Shylaja; Muffih, Benjamin; Heidenreich, John
2011-01-01
The introduction of software technology in a life-dependent environment requires the development team to execute a process that ensures a high level of software reliability and correctness. Despite their popularity, agile methods are generally assumed to be inappropriate as a process family in these environments due to their lack of emphasis on documentation, traceability, and other formal techniques. Agile methods, notably Scrum, favor empirical process control, or small constant adjustments in a tight feedback loop. This paper challenges the assumption that agile methods are inappropriate for safety-critical software development. Agile methods are flexible enough to encourage the right amount of ceremony; therefore if safety-critical systems require greater emphasis on activities like formal specification and requirements management, then an agile process will include these as necessary activities. Furthermore, agile methods focus more on continuous process management and code-level quality than classic software engineering process models. We present our experiences on the image-guided surgical toolkit (IGSTK) project as a backdrop. IGSTK is an open source software project employing agile practices since 2004. We started with the assumption that a lighter process is better, focused on evolving code, and only adding process elements as the need arose. IGSTK has been adopted by teaching hospitals and research labs, and used for clinical trials. Agile methods have matured since the academic community suggested they are not suitable for safety-critical systems almost a decade ago; we present our experiences as a case study for renewing the discussion. PMID:21799545
Agile Methods for Open Source Safety-Critical Software.
Gary, Kevin; Enquobahrie, Andinet; Ibanez, Luis; Cheng, Patrick; Yaniv, Ziv; Cleary, Kevin; Kokoori, Shylaja; Muffih, Benjamin; Heidenreich, John
2011-08-01
The introduction of software technology in a life-dependent environment requires the development team to execute a process that ensures a high level of software reliability and correctness. Despite their popularity, agile methods are generally assumed to be inappropriate as a process family in these environments due to their lack of emphasis on documentation, traceability, and other formal techniques. Agile methods, notably Scrum, favor empirical process control, or small constant adjustments in a tight feedback loop. This paper challenges the assumption that agile methods are inappropriate for safety-critical software development. Agile methods are flexible enough to encourage the right amount of ceremony; therefore if safety-critical systems require greater emphasis on activities like formal specification and requirements management, then an agile process will include these as necessary activities. Furthermore, agile methods focus more on continuous process management and code-level quality than classic software engineering process models. We present our experiences on the image-guided surgical toolkit (IGSTK) project as a backdrop. IGSTK is an open source software project employing agile practices since 2004. We started with the assumption that a lighter process is better, focused on evolving code, and only adding process elements as the need arose. IGSTK has been adopted by teaching hospitals and research labs, and used for clinical trials. Agile methods have matured since the academic community suggested they are not suitable for safety-critical systems almost a decade ago; we present our experiences as a case study for renewing the discussion.
NASA Technical Reports Server (NTRS)
Hornstein, Rhoda S.; Wunderlich, Dana A.; Willoughby, John K.
1992-01-01
New and innovative software technology is presented that provides a cost-effective bridge for smoothly transitioning prototype software, in the field of planning and scheduling, into an operational environment. Specifically, this technology mixes the flexibility and human design efficiency of dynamic data typing with the rigor and run-time efficiencies of static data typing. This new technology provides a very valuable tool for conducting the extensive, up-front system prototyping that leads to specifying the correct system and producing a reliable, efficient version that will be operationally effective and will be accepted by the intended users.
A self-referential HOWTO on release engineering
DOE Office of Scientific and Technical Information (OSTI.GOV)
Galassi, Mark C.
Release engineering is a fundamental part of the software development cycle: it is the point at which quality control is exercised and bug fixes are integrated. The way in which software is released also gives the end user her first experience of a software package, while in scientific computing release engineering can guarantee reproducibility. For these reasons and others, the release process is a good indicator of the maturity and organization of a development team. Software teams often do not put in place a release process at the beginning. This is unfortunate because the team does not have early and continuous execution of test suites, and it does not exercise the software in the same conditions as the end users. I describe an approach to release engineering based on the software tools developed and used by the GNU project, together with several specific proposals related to packaging and distribution. I do this in a step-by-step manner, demonstrating how this very paper is written and built using proper release engineering methods. Because many aspects of release engineering are not exercised in the building of the paper, the accompanying software repository also contains examples of software libraries.
Program Facilitates CMMI Appraisals
NASA Technical Reports Server (NTRS)
Sweetser, Wesley
2005-01-01
A computer program has been written to facilitate appraisals according to the methodology of Capability Maturity Model Integration (CMMI). [CMMI is a government/industry standard, maintained by the Software Engineering Institute at Carnegie Mellon University, for objectively assessing the engineering capability and maturity of an organization (especially an organization that produces software).] The program assists in preparation for a CMMI appraisal by providing drop-down lists suggesting required artifacts or evidence. It identifies process areas for which similar evidence is required and includes a copy feature that reduces or eliminates repetitive data entry. It generates reports to show the entire framework for reference, the appraisal artifacts to determine readiness for an appraisal, and lists of interviewees and questions to ask them during the appraisal. During an appraisal, the program provides screens for entering observations and ratings, and for reviewing the evidence provided thus far. Findings concerning strengths and weaknesses can be exported for use in a report or a graphical presentation. The program generates a chart showing the capability level ratings of the organization. A context-sensitive Windows help system enables a novice to use the program and learn about the CMMI appraisal process.
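The evidence checklist, the copy feature for process areas that share artifacts, and the readiness report can be sketched with a simple data layout. The process area names and artifact names below are hypothetical examples, not the program's actual data model.

```python
# Per process area: expected artifacts mapped to recorded evidence
# locations (None = not yet provided). Hypothetical checklist.
evidence = {
    "Project Planning":       {"project plan": None, "estimates": None},
    "Risk Management":        {"risk register": None},
    "Measurement & Analysis": {"metrics report": None},
}

def record(area, artifact, location):
    """Log where the evidence for one expected artifact was found."""
    evidence[area][artifact] = location

def copy_evidence(artifact, src_area, dst_area):
    """Reuse an artifact already recorded under another process area,
    mirroring the copy feature that avoids repetitive data entry."""
    evidence[dst_area][artifact] = evidence[src_area][artifact]

def readiness(area):
    """Fraction of an area's expected artifacts with recorded evidence,
    the basis of a readiness report."""
    items = evidence[area]
    return sum(v is not None for v in items.values()) / len(items)
```

A real appraisal tool adds the CMMI framework reference data, interviews, and ratings on top of this kind of bookkeeping.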
The maturing of the quality improvement paradigm in the SEL
NASA Technical Reports Server (NTRS)
Basili, Victor R.
1993-01-01
The Software Engineering Laboratory uses a paradigm for improving the software process and product, called the quality improvement paradigm. This paradigm has evolved over the past 18 years, along with our software development processes and products. Since 1976, when we first began the SEL, we have learned a great deal about improving the software process and product, making a great many mistakes along the way. The quality improvement paradigm, as it is currently defined, can be broken into six steps: characterize the current project and its environment with respect to the appropriate models and metrics; set the quantifiable goals for successful project performance and improvement; choose the appropriate process model and supporting methods and tools for this project; execute the processes, construct the products, and collect, validate, and analyze the data to provide real-time feedback for corrective action; analyze the data to evaluate the current practices, determine problems, record findings, and make recommendations for future project improvements; and package the experience gained in the form of updated and refined models and other forms of structured knowledge gained from this and prior projects, and save it in an experience base to be reused on future projects.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoo, Jun Soo; Choi, Yong Joon; Smith, Curtis Lee
2016-09-01
This document addresses two subjects involved with the RELAP-7 Software Verification and Validation Plan (SVVP): (i) the principles and plan to assure the independence of RELAP-7 assessment through the code development process, and (ii) the work performed to establish the RELAP-7 assessment plan, i.e., the assessment strategy, literature review, and identification of RELAP-7 requirements. Then, the Requirements Traceability Matrices (RTMs) proposed in previous document (INL-EXT-15-36684) are updated. These RTMs provide an efficient way to evaluate the RELAP-7 development status as well as the maturity of RELAP-7 assessment through the development process.
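A Requirements Traceability Matrix of the kind mentioned can be modeled as a mapping from requirements to the assessment cases that exercise them, from which coverage (one gauge of assessment maturity) falls out directly. The requirement and case names below are hypothetical, not RELAP-7's actual RTM entries:

```python
# Hypothetical RTM: requirement IDs mapped to the assessment cases
# (verification/validation problems) that exercise them.
rtm = {
    "R-1 mass conservation": ["verification/conservation_test"],
    "R-2 two-phase flow model": ["validation/edwards_pipe"],
    "R-3 critical heat flux": [],          # not yet covered
}

def coverage(rtm):
    """Fraction of requirements traced to at least one assessment case."""
    covered = sum(1 for cases in rtm.values() if cases)
    return covered / len(rtm)

def untraced(rtm):
    """Requirements with no assessment case yet -- gaps in the plan."""
    return [req for req, cases in rtm.items() if not cases]

print(f"{coverage(rtm):.0%}")   # 67%
print(untraced(rtm))            # ['R-3 critical heat flux']
```

Re-running such a report as the code evolves is one way an RTM gives an "efficient way to evaluate development status," as the document puts it.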
Toward a user-driven approach to radiology software solutions: putting the wag back in the dog.
Morgan, Matthew; Mates, Jonathan; Chang, Paul
2006-09-01
The relationship between healthcare providers and the software industry is evolving. In many cases, industry's traditional, market-driven model is failing to meet the increasingly sophisticated and appropriately individualized needs of providers. Advances in both technology infrastructure and development methodologies have set the stage for the transition from a vendor-driven to a more user-driven process of solution engineering. To make this transition, providers must take an active role in the development process and vendors must provide flexible frameworks on which to build. Only then can the provider/vendor relationship mature from a purchaser/supplier to a codesigner/partner model, where true insight and innovation can occur.
Process and information integration via hypermedia
NASA Technical Reports Server (NTRS)
Hammen, David G.; Labasse, Daniel L.; Myers, Robert M.
1990-01-01
Success stories for advanced automation prototypes abound in the literature but the deployments of practical large systems are few in number. There are several factors that militate against the maturation of such prototypes into products. Here, the integration of advanced automation software into large systems is discussed. Advanced automation systems tend to be specific applications that need to be integrated and aggregated into larger systems. Systems integration can be achieved by providing expert user-developers with verified tools to efficiently create small systems that interface to large systems through standard interfaces. The use of hypermedia as such a tool in the context of the ground control centers that support Shuttle and space station operations is explored. Hypermedia can be an integrating platform for data, conventional software, and advanced automation software, enabling data integration through the display of diverse types of information and through the creation of associative links between chunks of information. Further, hypermedia enables process integration through graphical invoking of system functions. Through analysis and examples, researchers illustrate how diverse information and processing paradigms can be integrated into a single software platform.
Using Java for distributed computing in the Gaia satellite data processing
NASA Astrophysics Data System (ADS)
O'Mullane, William; Luri, Xavier; Parsons, Paul; Lammers, Uwe; Hoar, John; Hernandez, Jose
2011-10-01
In recent years Java has matured to a stable easy-to-use language with the flexibility of an interpreter (for reflection etc.) but the performance and type checking of a compiled language. When we started using Java for astronomical applications around 1999 they were the first of their kind in astronomy. Now a great deal of astronomy software is written in Java as are many business applications. We discuss the current environment and trends concerning the language and present an actual example of scientific use of Java for high-performance distributed computing: ESA's mission Gaia. The Gaia scanning satellite will perform a galactic census of about 1,000 million objects in our galaxy. The Gaia community has chosen to write its processing software in Java. We explore the manifold reasons for choosing Java for this large science collaboration. Gaia processing is numerically complex but highly distributable, some parts being embarrassingly parallel. We describe the Gaia processing architecture and its realisation in Java. We delve into the astrometric solution which is the most advanced and most complex part of the processing. The Gaia simulator is also written in Java and is the most mature code in the system. This has been successfully running since about 2005 on the supercomputer "Marenostrum" in Barcelona. We relate experiences of using Java on a large shared machine. Finally we discuss Java, including some of its problems, for scientific computing.
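The "embarrassingly parallel" structure amounts to partitioning sources into independent jobs, processing each job with no inter-job communication, and merging the results. A language-neutral sketch in Python (the Gaia software itself is Java, and `process_job` here is a trivial placeholder, not Gaia's real astrometric processing):

```python
def chunk(seq, n):
    """Split the source list into n independent jobs, as done when
    farming per-source processing out to cluster nodes."""
    k, m = divmod(len(seq), n)
    jobs, start = [], 0
    for i in range(n):
        size = k + (1 if i < m else 0)
        jobs.append(seq[start:start + size])
        start += size
    return jobs

def process_job(sources):
    """Stand-in for per-source processing; each source is handled
    independently, so jobs never need to communicate."""
    return [s * 2 for s in sources]   # placeholder computation

jobs = chunk(list(range(10)), 3)          # e.g. 3 worker nodes
results = [r for job in jobs for r in process_job(job)]
```

Because the jobs are independent, the same merge logic works whether `process_job` runs serially, in local processes, or on a distributed cluster.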
NASA Astrophysics Data System (ADS)
Preradović, D. M.; Mićić, Lj S.; Barz, C.
2017-05-01
Production conditions in today's world require software support at every stage of production and development of new products, for quality assurance and compliance with ISO standards. In addition to ISO standards as the usual metrics of quality, companies today focus on other optional standards, such as CMMI (Capability Maturity Model Integration), or prescribe their own standards. However, while intensive progress is being made in project management (PM), a significant number of projects worldwide still fail, having missed their goals, budgets or timeframes. This paper examines the role of software tools in project success rates, in a case study of internationally manufactured electrical equipment. The results of this research show the level of contribution of the project management software used to manage and develop new products to improving PM processes and PM functions, and how the selection of software tools affects the quality of PM processes and the share of successfully completed projects.
Assurance of Complex Electronics. What Path Do We Take?
NASA Technical Reports Server (NTRS)
Plastow, Richard A.
2007-01-01
Many of the methods used to develop software bear a close resemblance to Complex Electronics (CE) development. CE are now programmed to perform tasks that were previously handled in software, such as communication protocols. For instance, Field Programmable Gate Arrays (FPGAs) can have over a million logic gates, while system-on-chip (SoC) devices can combine a microprocessor, input and output channels, and sometimes an FPGA for programmability. With this increased intricacy, the possibility of "software-like" bugs such as incorrect design, logic, and unexpected interactions within the logic is great. Since CE devices are obscuring the hardware/software boundary, we propose that mature software methodologies may be utilized with slight modifications to develop these devices. By using standardized S/W engineering methods such as checklists, missing requirements and "bugs" can be detected earlier in the development cycle, thus creating a development process for CE that will be easily maintained and configurable based on the device used.
Improving the Agency's Software Acquisition Capability
NASA Technical Reports Server (NTRS)
Hankinson, Allen
2003-01-01
External development of software has often led to unsatisfactory results and great frustration for the assurance community. Contracts frequently omit critical assurance processes or the right to oversee software development activities. At a time when NASA depends more and more on software to implement critical system functions, a combination of three factors exacerbates this problem: 1) the ever-increasing trend to acquire rather than develop software in-house, 2) the trend toward performance-based contracts, and 3) acquisition vehicles that only state software requirements while leaving development standards and assurance methodologies up to the contractor. We propose to identify specific methods and tools that NASA projects can use to mitigate the adverse effects of the three problems. Two broad classes of methods/tools will be explored. The first will be those that provide NASA projects with insight and oversight into contractors' activities. The second will be those that help projects objectively assess, and thus improve, their software acquisition capability. Of particular interest is the Software Engineering Institute's (SEI) Software Acquisition Capability Maturity Model (SA-CMM).
Benchmarking Software Assurance Implementation
2011-05-18
The chicken (a.k.a. Process Focused Assessments): Management Systems (ISO 9001, ISO 27001, ISO 20000); Capability Maturity Models (CMMI, Assurance PRM, RMM, Assurance for CMMI); Lifecycle Processes (ISO/IEEE 15288, ISO/IEEE 12207); COBIT, ITIL, MS SDL, OSAMM, BSIMM. The egg (a.k.a. Product Focused Assessments): SCAP (NIST-SCAP); ISO/OMG/W3C standards (KDM, BPMN, RIF, XMI, RDF); OWASP Top 10; SANS Top 25; secure code checklists.
Practical Application of Model-based Programming and State-based Architecture to Space Missions
NASA Technical Reports Server (NTRS)
Horvath, Gregory A.; Ingham, Michel D.; Chung, Seung; Martin, Oliver; Williams, Brian
2006-01-01
Innovative systems and software engineering solutions are required to meet the increasingly challenging demands of deep-space robotic missions. While recent advances in the development of an integrated systems and software engineering approach have begun to address some of these issues, they are still at the core highly manual and, therefore, error-prone. This paper describes a task aimed at infusing MIT's model-based executive, Titan, into JPL's Mission Data System (MDS), a unified state-based architecture, systems engineering process, and supporting software framework. Results of the task are presented, including a discussion of the benefits and challenges associated with integrating mature model-based programming techniques and technologies into a rigorously-defined domain specific architecture.
The Earth System Documentation (ES-DOC) Software Process
NASA Astrophysics Data System (ADS)
Greenslade, M. A.; Murphy, S.; Treshansky, A.; DeLuca, C.; Guilyardi, E.; Denvil, S.
2013-12-01
Earth System Documentation (ES-DOC) is an international project supplying high-quality tools & services in support of earth system documentation creation, analysis and dissemination. It is nurturing a sustainable, standards-based documentation eco-system that aims to become an integral part of the next generation of exa-scale dataset archives. ES-DOC leverages open source software, and applies a software development methodology that places end-user narratives at the heart of all it does. ES-DOC has initially focused upon nurturing the Earth System Model (ESM) documentation eco-system and currently supports the following projects: * Coupled Model Inter-comparison Project Phase 5 (CMIP5); * Dynamical Core Model Inter-comparison Project (DCMIP); * National Climate Predictions and Projections Platforms Quantitative Evaluation of Downscaling Workshop. This talk will demonstrate that ES-DOC implements a relatively mature software development process. Taking a pragmatic Agile process as inspiration, ES-DOC: * Iteratively develops and releases working software; * Captures user requirements via a narrative based approach; * Uses online collaboration tools (e.g. Earth System CoG) to manage progress; * Prototypes applications to validate their feasibility; * Leverages meta-programming techniques where appropriate; * Automates testing whenever sensibly feasible; * Streamlines complex deployments to a single command; * Extensively leverages GitHub and Pivotal Tracker; * Enforces strict separation of the UI from underlying APIs; * Conducts code reviews.
A CMMI-based approach for medical software project life cycle study.
Chen, Jui-Jen; Su, Wu-Chen; Wang, Pei-Wen; Yen, Hung-Chi
2013-01-01
In terms of medical techniques, Taiwan has gained international recognition in recent years. However, the medical information system industry in Taiwan is still at a developing stage compared with the software industries in other nations. In addition, systematic development processes are indispensable elements of software development: they help developers increase their productivity and efficiency and avoid unnecessary risks arising during the development process. Thus, this paper presents an application of Light-Weight Capability Maturity Model Integration (LW-CMMI) to the Chang Gung Medical Research Project (CMRP) in the nuclear medicine field. The application integrates the user requirements, system design and testing stages of the software development process into a three-layer (Domain, Concept and Instance) model, expresses it in structural Systems Modeling Language (SysML) diagrams, and converts part of the manual effort necessary for project management maintenance into computational effort, for example the (semi-)automatic delivery of traceability management. The application supports establishing the artifacts "requirement specification document", "project execution plan document", "system design document" and "system test document", and can deliver a prototype of a lightweight project management tool for the nuclear medicine software project. The results of this application can serve as a reference for other medical institutions in developing medical information systems and supporting project management to achieve the aim of patient safety.
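Deriving traceability across the three layers, rather than maintaining it by hand, can be illustrated with a minimal sketch. The artifact identifiers below are invented for illustration, not taken from CMRP: each instance-layer artifact points to a concept, and each concept to a domain requirement, so requirement-to-test traceability becomes a lookup instead of a manually maintained matrix.

```python
# Hypothetical three-layer traceability: Domain -> Concept -> Instance.
domain = {"REQ-01": "record patient dose"}   # domain-layer requirement
concept = {"UC-Dose": "REQ-01"}              # concept artifact -> requirement
instance = {"TEST-17": "UC-Dose"}            # test artifact -> concept

def trace(test_id):
    """Derive which domain requirement an instance-level test covers."""
    return domain[concept[instance[test_id]]]

print(trace("TEST-17"))  # record patient dose
```

Because each link is stored only once, adding or renaming an artifact updates every derived trace automatically, which is the point of the "(semi-)automatic delivery of traceability management" described in the abstract.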
Maturity Model for Advancing Smart Grid Interoperability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knight, Mark; Widergren, Steven E.; Mater, J.
2013-10-28
Interoperability is about the properties of devices and systems to connect and work properly. Advancing interoperability eases integration and maintenance of the resulting interconnection. This leads to faster integration, lower labor and component costs, predictability of projects and the resulting performance, and evolutionary paths for upgrade. When specifications are shared and standardized, competition and novel solutions can bring new value streams to the community of stakeholders involved. Advancing interoperability involves reaching agreement for how things join at their interfaces. The quality of the agreements and the alignment of parties involved in the agreement present challenges that are best met with process improvement techniques. The GridWise® Architecture Council (GWAC) sponsored by the United States Department of Energy is supporting an effort to use concepts from capability maturity models used in the software industry to advance interoperability of smart grid technology. An interoperability maturity model has been drafted and experience is being gained through trials on various types of projects and community efforts. This paper describes the value and objectives of maturity models, the nature of the interoperability maturity model and how it compares with other maturity models, and experiences gained with its use.
Santiago, R C; Cunha, A R; Júnior, G C; Fernandes, N; Campos, M J S; Costa, L F M; Vitral, R W F; Bolognese, A M
2014-01-01
In the present study, we developed new software for quantitative analysis of cervical vertebrae maturation, and we evaluated its applicability through a multinomial logistic regression model (MLRM). Digitized images of the bodies of the second (C2), third (C3) and fourth (C4) cervical vertebrae were analysed in cephalometric radiographs of 236 subjects (116 boys and 120 girls) by using software developed for digitized vertebrae analysis. The sample was initially distributed into 11 categories according to Fishman's skeletal maturity indicators and was then grouped into four stages for quantitative cervical maturational changes (QCMC) analysis (QCMC I, II, III and IV). Seven variables of interest were measured and analysed to identify morphologic alterations of the vertebral bodies in each QCMC category. Statistically significant differences (p < 0.05) were observed among all QCMC categories for the variables analysed. The MLRM used to calculate the probability that an individual belonged to each of the four cervical vertebrae maturation categories was constructed by taking into account gender, chronological age and four variables determined by digitized vertebrae analysis (Ang_C3, MP_C3, MP_C4 and SP_C4). The MLRM presented a predictability of 81.4%. The weighted κ test showed almost perfect agreement (κ = 0.832) between the categories defined initially by the method of Fishman and those allocated by the MLRM. Significant alterations in the morphologies of the C2, C3 and C4 vertebral bodies analysed through the digitized vertebrae analysis software occur during the different stages of skeletal maturation. The model that combines the four parameters measured on the vertebral bodies, the age and the gender showed an excellent prediction.
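A multinomial logistic model of this kind assigns a subject to the stage with the highest softmax probability of linear scores. A sketch with placeholder coefficients (the weights below are illustrative only, not the fitted values from the study, and gender is omitted for brevity; the remaining features follow the paper's predictors):

```python
import math

CLASSES = ["QCMC I", "QCMC II", "QCMC III", "QCMC IV"]
# One weight vector per class: [bias, age, Ang_C3, MP_C3, MP_C4, SP_C4].
# Values are made up for illustration.
WEIGHTS = [
    [0.0, -0.50, 0.1, 0.0, 0.0, 0.0],
    [0.0, -0.10, 0.0, 0.1, 0.0, 0.0],
    [0.0,  0.10, 0.0, 0.0, 0.1, 0.0],
    [0.0,  0.50, 0.0, 0.0, 0.0, 0.1],
]

def predict(features):
    """Return per-class probabilities via the softmax of linear scores."""
    x = [1.0] + features                      # prepend bias term
    scores = [sum(w * v for w, v in zip(row, x)) for row in WEIGHTS]
    m = max(scores)                           # subtract max for stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return dict(zip(CLASSES, [e / total for e in exps]))

probs = predict([14.0, 5.0, 2.0, 2.0, 1.5])   # age 14 plus four measurements
best = max(probs, key=probs.get)              # most probable maturation stage
```

The real model's 81.4% predictability would come from fitting such weights to the 236-subject sample rather than hand-picking them.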
Moving Up the CMMI Capability and Maturity Levels Using Simulation
2008-01-01
Figures include: Alternative Process Tools, Including NPV and ROI; Top-Level View of the Full Life-Cycle Version of the IEEE 12207 PSIM, Including IV&V Layer; Screenshot of the Incremental Version Model; IEEE 12207 PSIM Showing the Top-Level Life-Cycle Phases; Software Detailed Design for the IEEE 12207 Life-Cycle Process; Incremental Life Cycle PSIM Configured for a Specific Project Using SEPG.
CMMI Version 1.2 and Beyond Systems and Software Technology Conference
2008-04-29
Presentations include "Extreme Programming (XP), Six Sigma, & CMMI: How They Can Work Together" and "CMMI V1.2 Model Changes" (CMMI Update: V1.2 and Beyond), together with a table of appraisal counts and reported maturity levels (1-5) by country (e.g., Argentina, 26 appraisals; Malaysia, 29; Australia, 26).
Onboard shuttle on-line software requirements system: Prototype
NASA Technical Reports Server (NTRS)
Kolkhorst, Barbara; Ogletree, Barry
1989-01-01
The prototype discussed here was developed as proof of a concept for a system which could support high volumes of requirements documents with integrated text and graphics; the solution proposed here could be extended to other projects whose goal is to place paper documents in an electronic system for viewing and printing purposes. The technical problems (such as conversion of documentation between word processors, management of a variety of graphics file formats, and difficulties involved in scanning integrated text and graphics) would be very similar for other systems of this type. Indeed, technological advances in areas such as scanning hardware and software and display terminals ensure that some of the problems encountered here will be solved in the near-term (less than five years). Examples of these solvable problems include automated input of integrated text and graphics, errors in the recognition process, and the loss of image information which results from the digitization process. The solution developed for the Online Software Requirements System is modular and allows hardware and software components to be upgraded or replaced as industry solutions mature. The extensive commercial software content allows the NASA customer to apply resources to solving the problem and maintaining documents.
Software Past, Present, and Future: Views from Government, Industry and Academia
NASA Technical Reports Server (NTRS)
Holcomb, Lee; Page, Jerry; Evangelist, Michael
2000-01-01
Views from the NASA CIO NASA Software Engineering Workshop on software development from the past, present, and future are presented. The topics include: 1) Software Past; 2) Software Present; 3) NASA's Largest Software Challenges; 4) 8330 Software Projects in Industry Standish Groups 1994 Report; 5) Software Future; 6) Capability Maturity Model (CMM): Software Engineering Institute (SEI) levels; 7) System Engineering Quality Also Part of the Problem; 8) University Environment Trends Will Increase the Problem in Software Engineering; and 9) NASA Software Engineering Goals.
Filling the Assurance Gap on Complex Electronics
NASA Technical Reports Server (NTRS)
Plastow, Richard A.
2007-01-01
Many of the methods used to develop software bear a close resemblance to Complex Electronics (CE) development. CE are now programmed to perform tasks that were previously handled by software, such as communication protocols. For example, the James Webb Space Telescope will use Field Programmable Gate Arrays (FPGAs), which can have over a million logic gates, to send telemetry. System-on-chip (SoC) devices, another type of complex electronics, can combine a microprocessor, input and output channels, and sometimes an FPGA for programmability. With this increased intricacy, the possibility of software-like bugs such as incorrect design, logic, and unexpected interactions within the logic is great. Since CE devices are obscuring the hardware/software boundary, mature software methodologies have been proposed, with slight modifications, to develop these devices. By using standardized S/W engineering methods such as checklists, missing requirements and bugs can be detected earlier in the development cycle, thus creating a development process for CE that can be easily maintained and configurable based on the device used.
Image Processing Occupancy Sensor
DOE Office of Scientific and Technical Information (OSTI.GOV)
The Image Processing Occupancy Sensor, or IPOS, is a novel sensor technology developed at the National Renewable Energy Laboratory (NREL). The sensor is based on low-cost embedded microprocessors widely used by the smartphone industry and leverages mature open-source computer vision software libraries. Compared to traditional passive infrared and ultrasonic-based motion sensors currently used for occupancy detection, IPOS has shown the potential for improved accuracy and a richer set of feedback signals for occupant-optimized lighting, daylighting, temperature setback, ventilation control, and other occupancy and location-based uses. Unlike traditional passive infrared (PIR) or ultrasonic occupancy sensors, which infer occupancy based only on motion, IPOS uses digital image-based analysis to detect and classify various aspects of occupancy, including the presence of occupants regardless of motion, their number, location, and activity levels, as well as the illuminance properties of the monitored space. The IPOS software leverages the recent availability of low-cost embedded computing platforms, computer vision software libraries, and camera elements.
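Detecting presence regardless of motion can be as simple as differencing the current frame against a reference image of the empty space, rather than against the previous frame as a motion sensor effectively does. The toy sketch below illustrates only that principle; IPOS itself uses mature computer-vision libraries and far richer classification. Frames are modeled as 2D lists of grayscale intensities, and all thresholds are invented for illustration:

```python
def changed_fraction(reference, current, threshold=10):
    """Fraction of pixels differing from the empty-room reference
    by more than `threshold` intensity levels."""
    changed = total = 0
    for row_r, row_c in zip(reference, current):
        for r, c in zip(row_r, row_c):
            total += 1
            if abs(r - c) > threshold:
                changed += 1
    return changed / total

def occupied(reference, current, min_fraction=0.05):
    """Presence is declared when enough of the scene deviates from
    the reference -- even if the occupant is perfectly still."""
    return changed_fraction(reference, current) >= min_fraction

# Tiny 4x4 synthetic frames: uniform empty room vs. a bright occupant.
empty  = [[50] * 4 for _ in range(4)]
person = [[50] * 4 for _ in range(2)] + [[200] * 4 for _ in range(2)]
print(occupied(empty, empty))   # False
print(occupied(empty, person))  # True
```

A real deployment would also update the reference frame slowly to track lighting changes, which is one reason the illuminance measurement mentioned in the abstract matters.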
Extending Cross-Generational Knowledge Flow Research in Edge Organizations
2008-06-01
...letting Protégé generate the basic user interface, and then gradually writing widgets and plug-ins to customize its look-and-feel and behavior. ... Earlier work (2007a) focused on cross-generational knowledge flows in edge organizations; we found that cross-generational biases affect tacit knowledge transfer. ... In the software engineering field, many mature methodologies already exist, such as the Rational Unified Process (Hunt, 2003) or Extreme Programming (Beck...
NASA Software Engineering Benchmarking Study
NASA Technical Reports Server (NTRS)
Rarick, Heather L.; Godfrey, Sara H.; Kelly, John C.; Crumbley, Robert T.; Wifl, Joel M.
2013-01-01
To identify best practices for the improvement of software engineering on projects, NASA's Offices of Chief Engineer (OCE) and Safety and Mission Assurance (OSMA) formed a team led by Heather Rarick and Sally Godfrey to conduct this benchmarking study. The primary goals of the study are to identify best practices that: Improve the management and technical development of software intensive systems; Have a track record of successful deployment by aerospace industries, universities [including research and development (R&D) laboratories], and defense services, as well as NASA's own component Centers; and Identify candidate solutions for NASA's software issues. Beginning in the late fall of 2010, focus topics were chosen and interview questions were developed, based on the NASA top software challenges. Between February 2011 and November 2011, the Benchmark Team interviewed a total of 18 organizations, consisting of five NASA Centers, five industry organizations, four defense services organizations, and four university or university R and D laboratory organizations. A software assurance representative also participated in each of the interviews to focus on assurance and software safety best practices. Interviewees provided a wealth of information on each topic area that included: software policy, software acquisition, software assurance, testing, training, maintaining rigor in small projects, metrics, and use of the Capability Maturity Model Integration (CMMI) framework, as well as a number of special topics that came up in the discussions. NASA's software engineering practices compared favorably with the external organizations in most benchmark areas, but in every topic, there were ways in which NASA could improve its practices. Compared to defense services organizations and some of the industry organizations, one of NASA's notable weaknesses involved communication with contractors regarding its policies and requirements for acquired software. 
One of NASA's strengths was its software assurance practices, which seemed to rate well in comparison to the other organizational groups and also seemed to include a larger scope of activities. An unexpected benefit of the software benchmarking study was the identification of many opportunities for collaboration in areas including metrics, training, sharing of CMMI experiences and resources such as instructors and CMMI Lead Appraisers, and even sharing of assets such as documented processes. A further unexpected benefit of the study was the feedback on NASA practices that was received from some of the organizations interviewed. From that feedback, other potential areas where NASA could improve were highlighted, such as accuracy of software cost estimation and budgetary practices. The detailed report contains discussion of the practices noted in each of the topic areas, as well as a summary of observations and recommendations from each of the topic areas. The resulting 24 recommendations from the topic areas were then consolidated to eliminate duplication and culled into a set of 14 suggested actionable recommendations. This final set of actionable recommendations, listed below, are items that can be implemented to improve NASA's software engineering practices and to help address many of the items that were listed in the NASA top software engineering issues. 1. Develop and implement standard contract language for software procurements. 2. Advance accurate and trusted software cost estimates for both procured and in-house software and improve the capture of actual cost data to facilitate further improvements. 3. Establish a consistent set of objectives and expectations, specifically types of metrics at the Agency level, so key trends and models can be identified and used to continuously improve software processes and each software development effort. 4. 
Maintain the CMMI Maturity Level requirement for critical NASA projects and use CMMI to measure organizations developing software for NASA. 5. Consolidate, collect and, if needed, develop common process principles and other assets across the Agency in order to provide more consistency in software development and acquisition practices and to reduce the overall cost of maintaining or increasing current NASA CMMI maturity levels. 6. Provide additional support for small projects that includes: (a) guidance for appropriate tailoring of requirements for small projects, (b) availability of suitable tools, including support tool set-up and training, and (c) training for small project personnel, assurance personnel and technical authorities on the acceptable options for tailoring requirements and performing assurance on small projects. 7. Develop software training classes for the more experienced software engineers using on-line training, videos, or small separate modules of training that can be accommodated as needed throughout a project. 8. Create guidelines to structure non-classroom training opportunities such as mentoring, peer reviews, lessons learned sessions, and on-the-job training. 9. Develop a set of predictive software defect data and a process for assessing software testing metric data against it. 10. Assess Agency-wide licenses for commonly used software tools. 11. Fill the knowledge gap in common software engineering practices for new hires and co-ops. 12. Work through the Science, Technology, Engineering and Mathematics (STEM) program with universities in strengthening education in the use of common software engineering practices and standards. 13. Follow up this benchmark study with a deeper look into what both internal and external organizations perceive as the scope of software assurance, the value they expect to obtain from it, and the shortcomings they experience in the current practice. 14.
Continue interactions with external software engineering environment through collaborations, knowledge sharing, and benchmarking.
Software Engineering Education: Some Important Dimensions
ERIC Educational Resources Information Center
Mishra, Alok; Cagiltay, Nergiz Ercil; Kilic, Ozkan
2007-01-01
Software engineering education has been emerging as an independent and mature discipline. Accordingly, various studies are being done to provide guidelines for curriculum design. The main focus of these guidelines is around core and foundation courses. This paper summarizes the current problems of software engineering education programs. It also…
Application of GIS Rapid Mapping Technology in Disaster Monitoring
NASA Astrophysics Data System (ADS)
Wang, Z.; Tu, J.; Liu, G.; Zhao, Q.
2018-04-01
With the rapid development of GIS and RS technology in recent years, GIS software has become increasingly mature and capable, and the parallel development of mathematical and statistical tools for spatial modeling and simulation has promoted the widespread application of quantitative methods in the field of geology. Based on a field disaster investigation and the construction of a spatial database, this paper uses remote sensing imagery, DEM data and GIS technology to obtain the data needed for disaster vulnerability analysis, and applies the information model to carry out disaster risk assessment mapping. Using ArcGIS software and its spatial data modeling methods, the basic data for the disaster risk mapping process were acquired and processed, and the spatial data simulation tools were used to map the disaster rapidly.
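The abstract applies "the information model" for risk assessment without giving its exact form; in GIS hazard mapping this usually refers to the statistical information-value method, in which each class of each conditioning factor scores the log-ratio of its hazard density to the overall hazard density, and a cell's risk score is the sum over its factor classes. A minimal sketch under that assumption (all names illustrative):

```python
import math

def information_value(hazard_in_class, cells_in_class, hazard_total, cells_total):
    """Information value of one factor class: the log of the ratio between
    the hazard density inside the class and the overall hazard density."""
    class_density = hazard_in_class / cells_in_class
    overall_density = hazard_total / cells_total
    return math.log(class_density / overall_density)

def cell_risk_score(class_values):
    """Risk score of one map cell: the sum of the information values of the
    factor classes (slope, lithology, land use, ...) the cell falls into."""
    return sum(class_values)
```

Positive values mark classes in which hazards are over-represented; in practice the per-class scores are computed once from the inventory and then applied cell by cell as a raster overlay.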
Process Improvement in a Radically Changing Organization
NASA Technical Reports Server (NTRS)
Varga, Denise M.; Wilson, Barbara M.
2007-01-01
This presentation describes how the NASA Glenn Research Center planned and implemented a process improvement effort in response to a radically changing environment. As a result of a presidential decision to redefine the Agency's mission, many ongoing projects were canceled and future workload would be awarded based on relevance to the Exploration Initiative. NASA imposed a new Procedural Requirements standard on all future software development, and the Center needed to redesign its processes from CMM Level 2 objectives to meet the new standard and position itself for CMMI. The intended audience for this presentation is systems/software developers and managers in a large, research-oriented organization that may need to respond to imposed standards while also pursuing CMMI Maturity Level goals. A set of internally developed tools will be presented, including an overall Process Improvement Action Item database, a formal inspection/peer review tool, metrics collection spreadsheet, and other related technologies. The Center also found a need to charter Technical Working Groups (TWGs) to address particular Process Areas. In addition, a Marketing TWG was needed to communicate the process changes to the development community, including an innovative web site portal.
Software engineering from a Langley perspective
NASA Technical Reports Server (NTRS)
Voigt, Susan
1994-01-01
A brief introduction to software engineering is presented. The talk is divided into four sections, beginning with the question 'What is software engineering?', followed by a brief history of the progression of software engineering at the Langley Research Center in the context of an expanding computing environment. Several basic concepts and terms are introduced, including software development life cycles and maturity levels. Finally, comments are offered on what software engineering means for the Langley Research Center and where to find more information on the subject.
Wang, Xiao-Jing; Wang, Xiao-Xing; Wang, Ya-Jun; Wang, Xi-Zhong; He, Guang-Xin; Chen, Hong-Wei; Fei, Li-Song
2002-09-01
Activin, a member of the transforming growth factor-beta (TGF-beta) superfamily of proteins and receptors, is known to have broad-ranging effects in organisms. The mature peptide of the beta A subunit of this gene, one of the most highly conserved sequences, can elevate the basal secretion of follicle-stimulating hormone (FSH) in the pituitary, and FSH is pivotal to reproduction. Reproductive failure is one of the main factors driving the giant panda toward extinction. The sequence of the Activin beta A subunit gene mature peptide was successfully amplified from giant panda, red panda and Malayan sun bear genomic DNA by polymerase chain reaction (PCR) with a pair of degenerate primers. The PCR products were cloned into the Escherichia coli vector pBlueScript+. Sequence analysis of the Activin beta A subunit gene mature peptides shows that the length of this gene segment is the same (359 bp) and that there is no intron in any of the three species. The sequence encodes a peptide of 119 amino acid residues. Homology comparison demonstrates 93.9% DNA homology and 99% amino acid homology among these three species. Both a GenBank BLAST search and the restriction enzyme map reveal that the sequences of the Activin beta A subunit gene mature peptides of different species have been highly conserved during evolution. Phylogenetic analysis was performed with the PHYLIP software package, and a consistent phylogenetic tree was obtained with three different methods. The software analysis accords with the view that the giant panda has a closer relationship to the Malayan sun bear than to the red panda, and that the giant panda should be grouped into the bear family (Ursidae) with the Malayan sun bear. As for the red panda, it would be better grouped into its own family (the red panda family) because of the great difference between the red panda and the bears (Ursidae).
NASA Experience with CMM and CMMI
NASA Technical Reports Server (NTRS)
Crumbley, Tim; Kelly, John C.
2010-01-01
This slide presentation reviews the experience NASA has had in using Capability Maturity Model (CMM) and Capability Maturity Model Integration (CMMI). In particular this presentation reviews the agency's experience within the software engineering discipline and the lessons learned and key impacts from using CMMI.
Applying Formal Methods to NASA Projects: Transition from Research to Practice
NASA Technical Reports Server (NTRS)
Othon, Bill
2009-01-01
NASA project managers attempt to manage risk by relying on mature, well-understood process and technology when designing spacecraft. In the case of crewed systems, the margin for error is even tighter and leads to risk aversion. But as we look to future missions to the Moon and Mars, the complexity of the systems will increase as the spacecraft and crew work together with less reliance on Earth-based support. NASA will be forced to look for new ways to do business. Formal methods technologies can help NASA develop complex but cost effective spacecraft in many domains, including requirements and design, software development and inspection, and verification and validation of vehicle subsystems. To realize these gains, the technologies must be matured and field-tested so that they are proven when needed. During this discussion, current activities used to evaluate FM technologies for Orion spacecraft design will be reviewed. Also, suggestions will be made to demonstrate value to current designers, and mature the technology for eventual use in safety-critical NASA missions.
NASA Astrophysics Data System (ADS)
Kruba, Steve; Meyer, Jim
Business process management suites (BPMSs) represent one of the fastest growing segments in the software industry as organizations automate their key business processes. As this market matures, it is interesting to compare it to Chris Anderson's "Long Tail." Although the 2004 "Long Tail" article in Wired magazine was primarily about the media and entertainment industries, it has since been applied (and perhaps misapplied) to other markets. Analysts describe a "Tail of BPM" market that is, perhaps, several times larger than the traditional BPMS product market. This paper draws comparisons between the concepts in Anderson's article (and subsequent book) and the BPM solutions market.
The People Capability Maturity Model
ERIC Educational Resources Information Center
Wademan, Mark R.; Spuches, Charles M.; Doughty, Philip L.
2007-01-01
The People Capability Maturity Model[R] (People CMM[R]) advocates a staged approach to organizational change. Developed by the Carnegie Mellon University Software Engineering Institute, this model seeks to bring discipline to the people side of management by promoting a structured, repeatable, and predictable approach for improving an…
Preliminary Design of an Autonomous Amphibious System
2016-09-01
Changing vehicle dynamics will require innovative new autonomy algorithms. The report describes the developed software architecture, the drive-by-wire kit, and the supporting communications architecture, along with software maturation plans and planned refinement of the drive-by-wire design.
The Effectiveness of Software Project Management Practices: A Quantitative Measurement
2011-03-01
Fauzi and Ramli presented the Software Project Management Maturity Assessment (SPMMA) model (Ramli, 2007) to assess software project management practices; its purpose is to help a company measure the strengths and weaknesses of its software project management. The SPMMA was carried out on one mid-size Information Technology (IT) company, based on questionnaire responses, interviews and discussions.
Improving the Effectiveness of Program Managers
2006-05-03
Presented by GAO at the Systems and Software Technology Conference, Salt Lake City, Utah, May 3, 2006. Topics include companies' best practices (Motorola, Caterpillar, Toyota, FedEx, NCR Teradata, Boeing, Hughes Space and Communications); disciplined software and management practices; total ownership costs; collection of metrics data to improve software reliability; technology readiness levels and design maturity; and statistical ...
Using Pilots to Assess the Value and Approach of CMMI Implementation
NASA Technical Reports Server (NTRS)
Godfrey, Sara; Andary, James; Rosenberg, Linda
2002-01-01
At Goddard Space Flight Center (GSFC), we have chosen to use Capability Maturity Model Integrated (CMMI) to guide our process improvement program. Projects at GSFC consist of complex systems of software and hardware that control satellites, operate ground systems, run instruments, manage databases and data and support scientific research. It is a challenge to launch a process improvement program that encompasses our diverse systems, yet is manageable in terms of cost effectiveness. In order to establish the best approach for improvement, our process improvement effort was divided into three phases: 1) Pilot projects; 2) Staged implementation; and 3) Sustainment and continual improvement. During Phase 1 the focus of the activities was on a baselining process, using pre-appraisals in order to get a baseline for making a better cost and effort estimate for the improvement effort. Pilot pre-appraisals were conducted from different perspectives so different approaches for process implementation could be evaluated. Phase 1 also concentrated on establishing an improvement infrastructure and training of the improvement teams. At the time of this paper, three pilot appraisals have been completed. Our initial appraisal was performed in a flight software area, considering the flight software organization as the organization. The second appraisal was done from a project perspective, focusing on systems engineering and acquisition, and using the organization as GSFC. The final appraisal was in a ground support software area, again using GSFC as the organization. This paper will present our initial approach, lessons learned from all three pilots and the changes in our approach based on the lessons learned.
A Mechanism of Modeling and Verification for SaaS Customization Based on TLA
NASA Astrophysics Data System (ADS)
Luan, Shuai; Shi, Yuliang; Wang, Haiyang
With the gradual maturation of SOA and the rapid development of the Internet, SaaS has become a popular software service mode. Customization actions in SaaS are usually subject to internal and external dependency relationships. This paper first introduces a method for modeling the customization process based on the Temporal Logic of Actions (TLA), and then proposes a verification algorithm to assure that each step in customization will not cause unpredictable influence on the system and will follow the related rules defined by the SaaS provider.
2010-03-01
Fragment of a simulation-software maturity assessment: ... of sub-routines. Thermal history: Abaqus FEM engine mature, applied within ABAQUS. Residual stress and distortion: unknown maturity for HTC. Mold fill: continuum FEM software for fluid flow, heat flow and stress analysis; FEM implementation mature. Thermal and mushy zone history: needs ... Focused ... investment. The committee's ICME vision is comprehensive, expansive, and involves the entire materials community. The scope of this white paper is ...
Management Guidelines for Database Developers' Teams in Software Development Projects
NASA Astrophysics Data System (ADS)
Rusu, Lazar; Lin, Yifeng; Hodosi, Georg
The worldwide job market for database developers (DBDs) has been continually growing over the last several years. In some companies, DBDs are organized as a special team (the DBD team) to support other projects and roles. As a new role, the DBD team faces a major problem: there are no management guidelines for it. The team manager does not know which kinds of tasks should be assigned to this team and what practices should be used during the DBDs' work. Therefore, in this paper we develop a set of management guidelines, comprising 8 fundamental tasks and 17 practices from the software development process, by using two methodologies, the Capability Maturity Model (CMM) and agile software development (in particular Scrum), in order to improve the DBD team's work. Moreover, the management guidelines developed here have been complemented with practices from the authors' experience in this area and have been evaluated in the case of a software company. The management guidelines for DBD teams presented in this paper could also be useful for other companies that use a DBD team and could contribute towards an increase in the efficiency of these teams in their work on software development projects.
A framework for assessing the adequacy and effectiveness of software development methodologies
NASA Technical Reports Server (NTRS)
Arthur, James D.; Nance, Richard E.
1990-01-01
Tools, techniques, environments, and methodologies dominate the software engineering literature, but relatively little research in the evaluation of methodologies is evident. This work reports an initial attempt to develop a procedural approach to evaluating software development methodologies. Prominent in this approach are: (1) an explication of the role of a methodology in the software development process; (2) the development of a procedure based on linkages among objectives, principles, and attributes; and (3) the establishment of a basis for reduction of the subjective nature of the evaluation through the introduction of properties. An application of the evaluation procedure to two Navy methodologies has provided consistent results that demonstrate the utility and versatility of the evaluation procedure. Current research efforts focus on the continued refinement of the evaluation procedure through the identification and integration of product quality indicators reflective of attribute presence, and the validation of metrics supporting the measure of those indicators. The consequent refinement of the evaluation procedure offers promise of a flexible approach that admits to change as the field of knowledge matures. In conclusion, the procedural approach presented in this paper represents a promising path toward the end goal of objectively evaluating software engineering methodologies.
NASA Astrophysics Data System (ADS)
Belloni, V.; Ravanelli, R.; Nascetti, A.; Di Rita, M.; Mattei, D.; Crespi, M.
2018-05-01
In the last few decades, there has been growing interest in non-contact methods for full-field displacement and strain measurement. Among such techniques, Digital Image Correlation (DIC) has received particular attention thanks to its ability to provide this information by comparing digital images of a sample surface before and after deformation. The method is now commonly adopted in civil, mechanical and aerospace engineering, and various companies and research groups have implemented 2D and 3D DIC software. This work first reviews the status of DIC software. It then presents a free and open source 2D DIC software package, named py2DIC, developed in Python at the Geodesy and Geomatics Division of DICEA of the University of Rome "La Sapienza". Its potential was evaluated by processing images captured during tensile tests performed in the Structural Engineering Lab of the University of Rome "La Sapienza" and comparing the results to those obtained using the commercial software Vic-2D developed by Correlated Solutions Inc., USA. The agreement of these results at the one-hundredth-of-a-millimetre level demonstrates the possibility of using this open source software as a valuable 2D DIC tool to measure full-field displacements on the investigated sample surface.
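At its core, 2D DIC tracks a small pixel subset from the reference image into the deformed image by maximizing a correlation criterion. The sketch below shows only the basic idea, an exhaustive integer-pixel search using the zero-normalized cross-correlation (ZNCC) criterion; real packages such as py2DIC or Vic-2D add sub-pixel interpolation and subset shape functions, and all names here are illustrative.

```python
import math

def zncc(a, b):
    """Zero-normalized cross-correlation of two equally sized 2D subsets."""
    flat_a = [v for row in a for v in row]
    flat_b = [v for row in b for v in row]
    ma = sum(flat_a) / len(flat_a)
    mb = sum(flat_b) / len(flat_b)
    num = sum((x - ma) * (y - mb) for x, y in zip(flat_a, flat_b))
    den = math.sqrt(sum((x - ma) ** 2 for x in flat_a) *
                    sum((y - mb) ** 2 for y in flat_b))
    return num / den if den else 0.0

def subset(img, top, left, size):
    """Extract a size x size subset with its upper-left corner at (top, left)."""
    return [row[left:left + size] for row in img[top:top + size]]

def match_subset(ref, deformed, top, left, size, search=5):
    """Integer-pixel displacement (du, dv) of a reference subset, found by
    exhaustive ZNCC search over a +/- search window in the deformed image."""
    best, best_uv = -2.0, (0, 0)
    template = subset(ref, top, left, size)
    for dv in range(-search, search + 1):
        for du in range(-search, search + 1):
            t, l = top + dv, left + du
            if 0 <= t and 0 <= l and t + size <= len(deformed) and l + size <= len(deformed[0]):
                c = zncc(template, subset(deformed, t, l, size))
                if c > best:
                    best, best_uv = c, (du, dv)
    return best_uv
```

Running the search over a grid of subsets yields a full-field displacement map; sub-pixel accuracy is then obtained by interpolating the correlation surface around the integer optimum.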
Online catalog access and distribution of remotely sensed information
NASA Astrophysics Data System (ADS)
Lutton, Stephen M.
1997-09-01
Remote sensing is providing voluminous data and value added information products. Electronic sensors, communication electronics, computer software, hardware, and network communications technology have matured to the point where a distributed infrastructure for remotely sensed information is a reality. The amount of remotely sensed data and information is making distributed infrastructure almost a necessity. This infrastructure provides data collection, archiving, cataloging, browsing, processing, and viewing for applications from scientific research to economic, legal, and national security decision making. The remote sensing field is entering a new exciting stage of commercial growth and expansion into the mainstream of government and business decision making. This paper overviews this new distributed infrastructure and then focuses on describing a software system for on-line catalog access and distribution of remotely sensed information.
Active Mirror Predictive and Requirements Verification Software (AMP-ReVS)
NASA Technical Reports Server (NTRS)
Basinger, Scott A.
2012-01-01
This software is designed to predict large active mirror performance at various stages in the fabrication lifecycle of the mirror. It was developed for 1-meter class powered mirrors for astronomical purposes, but is extensible to other geometries. The package accepts finite element model (FEM) inputs and laboratory measured data for large optical-quality mirrors with active figure control. It computes phenomenological contributions to the surface figure error using several built-in optimization techniques. These phenomena include stresses induced in the mirror by the manufacturing process and the support structure, the test procedure, high spatial frequency errors introduced by the polishing process, and other process-dependent deleterious effects due to light-weighting of the mirror. Then, depending on the maturity of the mirror, it either predicts the best surface figure error that the mirror will attain, or it verifies that the requirements for the error sources have been met once the best surface figure error has been measured. The unique feature of this software is that it ties together physical phenomenology with wavefront sensing and control techniques and various optimization methods including convex optimization, Kalman filtering, and quadratic programming to both generate predictive models and to do requirements verification. This software combines three distinct disciplines: wavefront control, predictive models based on FEM, and requirements verification using measured data in a robust, reusable code that is applicable to any large optics for ground and space telescopes. The software also includes state-of-the-art wavefront control algorithms that allow closed-loop performance to be computed. It allows for quantitative trade studies to be performed for optical systems engineering, including computing the best surface figure error under various testing and operating conditions. 
After the mirror manufacturing process and testing have been completed, the software package can be used to verify that the underlying requirements have been met.
Development of an aerial counting system in oil palm plantations
NASA Astrophysics Data System (ADS)
Zulyma Miserque Castillo, Jhany; Laverde Diaz, Rubbermaid; Rueda Guzmán, Claudia Leonor
2016-07-01
This paper proposes the development of an aerial counting system capable of capturing, processing and analyzing images of an oil palm plantation to register the number of cultivated palms. It begins with a study of the available UAV technologies to define the most appropriate model according to the project needs. As a result, a DJI Phantom 2 Vision+ is used to capture pictures that are processed by photogrammetry software to create orthomosaics of the areas of interest, which are handled by the developed software to calculate the number of palms contained in them. The implemented algorithm uses a sliding window technique on image pyramids to generate candidate windows, an LBP descriptor to model the texture of the picture, a logistic regression model to classify the windows, and a non-maximum suppression algorithm to refine the decision. The system was tested on images different from the ones used for training and for establishing the set point. It showed a 95.34% detection rate with 97.83% precision for mature palms and a 79.26% detection rate with 97.53% precision for young palms, giving an F1 score of 0.97 for mature palms and 0.87 for the small ones. The results are satisfactory, yielding the census and high-quality images from which it is possible to extract more information about the area of interest. All this is achieved through a low-cost system capable of working even in cloudy conditions.
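The final stage of the pipeline described above, non-maximum suppression, collapses the cluster of overlapping candidate windows that a sliding-window detector fires around each palm into a single detection. A minimal greedy sketch (not the authors' code; the box format and threshold are illustrative assumptions):

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def non_max_suppression(boxes, scores, iou_threshold=0.3):
    """Keep the highest-scoring box in each cluster of overlapping candidates;
    returns the indices of the surviving boxes."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    for i in order:
        if all(iou(boxes[i], boxes[j]) <= iou_threshold for j in keep):
            keep.append(i)
    return keep
```

Each surviving index is one counted palm; the IoU threshold trades off merging near-touching crowns against double-counting a single tree.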
Software quality for 1997 - what works and what doesn't?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, C.
1997-11-01
This presentation provides a view of software quality for 1997 - what works and what doesn't. For many years, software quality assurance lagged behind hardware quality assurance in terms of methods, metrics, and successful results. New approaches such as Quality Function Deployment (QFD), the ISO 9000-9004 standards, the SEI maturity levels, and Total Quality Management (TQM) are starting to attract wide attention, and in some cases to bring software quality levels up to parity with manufacturing quality levels.
The Software Maturity Matrix: A Software Performance Metric
2003-01-28
Measurements are for managing - use them! Unused measurements have the same value as last night's unused hotel room or an empty airline seat. Be prepared to ... standard measurements are implicit; organization standard verification is implicit; organization standard SMM training can be the basis of an ...
1987-06-01
described the state of maturity of software engineering as being equivalent to the state of maturity of civil engineering before Pythagoras invented the ... formal verification languages, theorem provers or secure configuration management tools would have to be maintained and used in the PDSS Center to ...
Design, Qualification, and On Orbit Performance of the CALIPSO Aerosol Lidar Transmitter
NASA Technical Reports Server (NTRS)
Hovis, Floyd E.; Witt, Greg; Sullivan, Edward T.; Le, Khoa; Weimer, Carl; Applegate, Jeff; Luck, William S., Jr.; Verhapen, Ron; Cisewski, Michael S.
2007-01-01
The laser transmitter for the CALIPSO aerosol lidar mission has been operating on orbit as planned since June 2006. This document discusses the optical and laser system design and qualification process that led to this success. Space-qualifiable laser design guidelines included the use of mature laser technologies, the use of alignment sensitive resonator designs, the development and practice of stringent contamination control procedures, the operation of all optical components at appropriately derated levels, and the proper budgeting for the space-qualification of the electronics and software.
2010-01-01
"Soporte de Modelos Sistémicos: Aplicación al Sector de Desarrollo de Software de Argentina" ("Support of Systemic Models: Application to the Software Development Sector of Argentina"), PhD thesis, Universidad Tecnológica Nacional-Facultad ... The fragment's contents include: ... with New Results; Other Simulation Approaches; Conceptual Planning, Execution, and Operation of Combat Fire Support Effectiveness; Figure 29: Functional Structure of Multiple Regression Model; Figure 30: TSP Quality Plan One; Figure 31: TSP Quality Plan Two.
PREPARING FOR EXASCALE: ORNL Leadership Computing Application Requirements and Strategy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joubert, Wayne; Kothe, Douglas B; Nam, Hai Ah
2009-12-01
In 2009 the Oak Ridge Leadership Computing Facility (OLCF), a U.S. Department of Energy (DOE) facility at the Oak Ridge National Laboratory (ORNL) National Center for Computational Sciences (NCCS), elicited petascale computational science requirements from leading computational scientists in the international science community. This effort targeted science teams whose projects received large computer allocation awards on OLCF systems. A clear finding of this process was that in order to reach their science goals over the next several years, multiple projects will require computational resources in excess of an order of magnitude more powerful than those currently available. Additionally, for the longer term, next-generation science will require computing platforms of exascale capability in order to reach DOE science objectives over the next decade. It is generally recognized that achieving exascale in the proposed time frame will require disruptive changes in computer hardware and software. Processor hardware will become necessarily heterogeneous and will include accelerator technologies. Software must undergo the concomitant changes needed to extract the available performance from this heterogeneous hardware. This disruption portends to be substantial, not unlike the change to the message passing paradigm in the computational science community over 20 years ago. Since technological disruptions take time to assimilate, we must aggressively embark on this course of change now, to ensure that science applications and their underlying programming models are mature and ready when exascale computing arrives. This includes initiation of application readiness efforts to adapt existing codes to heterogeneous architectures, support of relevant software tools, and procurement of next-generation hardware testbeds for porting and testing codes.
The 2009 OLCF requirements process identified numerous actions necessary to meet this challenge: (1) Hardware capabilities must be advanced on multiple fronts, including peak flops, node memory capacity, interconnect latency, interconnect bandwidth, and memory bandwidth. (2) Effective parallel programming interfaces must be developed to exploit the power of emerging hardware. (3) Science application teams must now begin to adapt and reformulate application codes to the new hardware and software, typified by hierarchical and disparate layers of compute, memory and concurrency. (4) Algorithm research must be realigned to exploit this hierarchy. (5) When possible, mathematical libraries must be used to encapsulate the required operations in an efficient and useful way. (6) Software tools must be developed to make the new hardware more usable. (7) Science application software must be improved to cope with the increasing complexity of computing systems. (8) Data management efforts must be readied for the larger quantities of data generated by larger, more accurate science models. Requirements elicitation, analysis, validation, and management comprise a difficult and inexact process, particularly in periods of technological change. Nonetheless, the OLCF requirements modeling process is becoming increasingly quantitative and actionable, as the process becomes more developed and mature, and the process this year has identified clear and concrete steps to be taken. 
This report discloses (1) the fundamental science case driving the need for the next generation of computer hardware, (2) application usage trends that illustrate the science need, (3) application performance characteristics that drive the need for increased hardware capabilities, (4) resource and process requirements that make the development and deployment of science applications on next-generation hardware successful, and (5) summary recommendations for the required next steps within the computer and computational science communities.
The coming commoditization of processes.
Davenport, Thomas H
2005-06-01
Despite the much-ballyhooed increase in outsourcing, most companies are in do-it-yourself mode for the bulk of their processes, in large part because there's no way to compare outside organizations' capabilities with those of internal functions. Given the lack of comparability, it's almost surprising that anyone outsources today. But it's not surprising that cost is by far companies' primary criterion for evaluating outsourcers or that many companies are dissatisfied with their outsourcing relationships. A new world is coming, says the author, and it will lead to dramatic changes in the shape and structure of corporations. A broad set of process standards will soon make it easy to determine whether a business capability can be improved by outsourcing it. Such standards will also help businesses compare service providers and evaluate the costs versus the benefits of outsourcing. Eventually these costs and benefits will be so visible to buyers that outsourced processes will become a commodity, and prices will drop significantly. The low costs and low risk of outsourcing will accelerate the flow of jobs offshore, force companies to reassess their strategies, and change the basis of competition. The speed with which some businesses have already adopted process standards suggests that many previously unscrutinized areas are ripe for change. In the field of technology, for instance, the Carnegie Mellon Software Engineering Institute has developed a global standard for software development processes, called the Capability Maturity Model (CMM). For companies that don't have process standards in place, it makes sense for them to create standards by working with customers, competitors, software providers, businesses that processes may be outsourced to, and objective researchers and standard-setters. Setting standards is likely to lead to the improvement of both internal and outsourced processes.
Using color management in color document processing
NASA Astrophysics Data System (ADS)
Nehab, Smadar
1995-04-01
Color Management Systems have been used for several years in Desktop Publishing (DTP) environments. While this development hasn't matured yet, we are already experiencing the next generation of the color imaging revolution-Device Independent Color for the small office/home office (SOHO) environment. Though there are still open technical issues with device independent color matching, they are not the focal point of this paper. This paper discusses two new and crucial aspects in using color management in color document processing: the management of color objects and their associated color rendering methods; a proposal for a precedence order and handshaking protocol among the various software components involved in color document processing. As color peripherals become affordable to the SOHO market, color management also becomes a prerequisite for common document authoring applications such as word processors. The first color management solutions were oriented towards DTP environments whose requirements were largely different. For example, DTP documents are image-centric, as opposed to SOHO documents that are text and charts centric. To achieve optimal reproduction on low-cost SOHO peripherals, it is critical that different color rendering methods are used for the different document object types. The first challenge in using color management of color document processing is the association of rendering methods with object types. As a result of an evolutionary process, color matching solutions are now available as application software, as driver embedded software and as operating system extensions. Consequently, document processing faces a new challenge, the correct selection of the color matching solution while avoiding duplicate color corrections.
Yeo, Sang Seok; Jang, Sung Ho; Son, Su Min
2014-01-01
Background and Purpose: The corticospinal tract (CST) and corticoreticular pathway (CRP) are known to be important neural tracts for motor development. However, little is known about the difference in maturation of the CST and CRP. In this study, using diffusion tensor imaging (DTI), we investigated maturation of the CST and CRP in typically developed children and normal healthy adults. Methods: We recruited 75 normal healthy subjects for this study. DTI was performed at 1.5 T, and the CST and CRP were reconstructed using DTI-Studio software. Values of fractional anisotropy (FA) and fiber volume (FV) of the CST and CRP were measured. Results: In the current study, the threshold points for CST and CRP maturation differed in normal brain development. The FA value of the CST showed a steep increase until 7 years of age and then a gradual increase until adulthood; the CRP, however, showed a steep increase only until 2 years of age and then a very gradual increase or plateau until adulthood. In terms of FV, the CST showed a steep increase until 12 years and then a gradual increase until adulthood; in contrast, the CRP showed a gradual increase of FV across the whole age range (0–25 years). Conclusion: The difference in the maturation process between the CST and CRP appears to be related to the different periods of fine and gross motor development. This radiologic information can provide a scientific basis for understanding the development of motor function. PMID:25309378
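The FA values analysed above are derived from the three eigenvalues of the diffusion tensor at each voxel. As an illustration only (the standard textbook formula, not the DTI-Studio implementation; the example eigenvalues are assumed, typical white-matter magnitudes in mm^2/s), a minimal sketch:

```python
import math

def fractional_anisotropy(l1, l2, l3):
    """Standard FA formula from the three diffusion-tensor eigenvalues:
    FA = sqrt(3/2) * ||lambda - mean(lambda)|| / ||lambda||."""
    m = (l1 + l2 + l3) / 3.0
    num = math.sqrt((l1 - m) ** 2 + (l2 - m) ** 2 + (l3 - m) ** 2)
    den = math.sqrt(l1 ** 2 + l2 ** 2 + l3 ** 2)
    return math.sqrt(1.5) * num / den

# isotropic diffusion gives FA = 0; a single dominant direction pushes FA toward 1
fa_iso = fractional_anisotropy(1.0e-3, 1.0e-3, 1.0e-3)
fa_fiber = fractional_anisotropy(1.7e-3, 0.2e-3, 0.2e-3)
```

Rising FA with age, as reported for the CST and CRP, reflects increasingly directional diffusion along maturing, myelinating fiber bundles.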
NASA Astrophysics Data System (ADS)
Ott, S.
2010-12-01
The Herschel Space Observatory is the fourth cornerstone mission in the ESA science programme and performs photometry and spectroscopy in the 55-672 micron range. The development of the Herschel Data Processing System started in 2002 to support the data analysis for Instrument Level Tests. The Herschel Data Processing System was used for the pre-flight characterisation of the instruments, and during various ground segment test campaigns. Following the successful launch of Herschel on 14 May 2009, the Herschel Data Processing System demonstrated its maturity when the first PACS preview observation of M51 was processed within 30 minutes of reception of the first science data after launch. The first HIFI observations of DR21 were also successfully reduced to high-quality spectra, followed by SPIRE observations of M66 and M74. A fast turn-around cycle between data retrieval and the production of science-ready products was demonstrated during the Herschel Science Demonstration Phase Initial Results Workshop held 7 months after launch, clear proof that the system has reached a good level of maturity. We will summarise the scope, the management and the development methodology of the Herschel Data Processing System, present some key software elements, and give an overview of the current status and future development milestones.
Why and how Mastering an Incremental and Iterative Software Development Process
NASA Astrophysics Data System (ADS)
Dubuc, François; Guichoux, Bernard; Cormery, Patrick; Mescam, Jean Christophe
2004-06-01
One of the key issues regularly mentioned in the current software crisis of the space domain is related to the software development process that must be performed while the system definition is not yet frozen. This is especially true for complex systems like launchers or space vehicles. Several more or less mature solutions are under study by EADS SPACE Transportation and will be presented in this paper. The basic principle is to develop the software through an iterative and incremental process instead of the classical waterfall approach, with the following advantages: - It permits systematic management and incorporation of requirements changes over the development cycle at minimal cost. As far as possible, the most dimensioning requirements are analyzed and developed first, to validate the architecture concept very early without the details. - A software prototype is available very quickly. It improves communication between the system and software teams, as it enables them to check the common understanding of the system requirements very early and efficiently. - It allows the software team to complete a whole development cycle very early, and thus to become quickly familiar with the software development environment (methodology, technology, tools...). This is particularly important when the team is new, or when the environment has changed since the previous development.
In any case, it greatly improves the learning curve of the software team. These advantages seem very attractive, but mastering an iterative development process efficiently is not so easy and raises many difficulties, such as: - How to freeze one configuration of the system definition as a development baseline, while most of the system requirements are completely and naturally unstable? - How to distinguish stable/unstable and dimensioning/standard requirements? - How to plan the development of each increment? - How to link classical waterfall development milestones with an iterative approach: when should the classical reviews be performed (Software Specification Review, Preliminary Design Review, Critical Design Review, Code Review, etc.)? Several solutions envisaged or already deployed by EADS SPACE Transportation will be presented, from both a methodological and a technological point of view: - How the MELANIE EADS ST internal methodology improves the concurrent engineering activities between GNC, software and simulation teams in a very iterative and reactive way. - How the CMM approach can help by better formalizing the Requirements Management and Planning processes. - How Automatic Code Generation with "certified" tools (SCADE) can further dramatically shorten the development cycle. The presentation will conclude with an evaluation of the cost and schedule reduction based on a pilot application, comparing figures for two similar projects: one using the classical waterfall process, the other an iterative and incremental approach.
Health management and controls for Earth-to-orbit propulsion systems
NASA Astrophysics Data System (ADS)
Bickford, R. L.
1995-03-01
Avionics and health management technologies increase the safety and reliability while decreasing the overall cost for Earth-to-orbit (ETO) propulsion systems. New ETO propulsion systems will depend on highly reliable fault tolerant flight avionics, advanced sensing systems and artificial intelligence aided software to ensure critical control, safety and maintenance requirements are met in a cost effective manner. Propulsion avionics consist of the engine controller, actuators, sensors, software and ground support elements. In addition to control and safety functions, these elements perform system monitoring for health management. Health management is enhanced by advanced sensing systems and algorithms which provide automated fault detection and enable adaptive control and/or maintenance approaches. Aerojet is developing advanced fault tolerant rocket engine controllers which provide very high levels of reliability. Smart sensors and software systems which significantly enhance fault coverage and enable automated operations are also under development. Smart sensing systems, such as flight capable plume spectrometers, have reached maturity in ground-based applications and are suitable for bridging to flight. Software to detect failed sensors has reached similar maturity. This paper will discuss fault detection and isolation for advanced rocket engine controllers as well as examples of advanced sensing systems and software which significantly improve component failure detection for engine system safety and health management.
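Software that detects failed sensors, as described above, often reduces in its simplest form to a residual check against redundant measurements. The following toy sketch illustrates that idea with a median-residual vote; the function, threshold, and readings are all hypothetical, not Aerojet's actual algorithm or data:

```python
def failed_sensors(readings, tol):
    """Flag redundant sensors whose reading deviates from the median of
    the set by more than tol. A simple residual check; assumes an
    odd-sized redundant sensor set so the median is a real reading."""
    ordered = sorted(readings)
    median = ordered[len(ordered) // 2]
    return [i for i, r in enumerate(readings) if abs(r - median) > tol]

# three redundant chamber-pressure readings (hypothetical values, psi):
# the third sensor has drifted far from its peers and should be flagged
faults = failed_sensors([2010.5, 2008.9, 2350.0], tol=25.0)
```

In a flight controller, a flagged channel would be excluded from the control loop and reported to the health management system; real implementations add persistence counters and rate-of-change checks to avoid false alarms.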
NASA Astrophysics Data System (ADS)
Semushin, I. V.; Tsyganova, J. V.; Ugarov, V. V.; Afanasova, A. I.
2018-05-01
Russian higher education institutions' tradition of teaching large-enrolment classes impairs students' striving for individual prominence, one-upmanship, and hopes for originality. Intending to convert these drawbacks into benefits, a Project-Centred Education Model (PCEM) has been introduced to deliver Computational Mathematics and Information Science courses. The model combines a Frontal Competitive Approach and a Project-Driven Learning (PDL) framework. The PDL framework has been developed by stating and solving three design problems: (i) enhance the diversity of project assignments on specific computational methods and algorithmic approaches, (ii) balance the similarity and dissimilarity of the project assignments, and (iii) develop a software assessment tool suitable for evaluating the technological maturity of students' project deliverables, thus reducing the instructor's workload and the risk of oversight. The positive experience accumulated over 15 years shows that implementing the PCEM keeps students motivated to strive for success and to rise to higher levels of computational and software engineering skill.
Formal Methods for Life-Critical Software
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Johnson, Sally C.
1993-01-01
The use of computer software in life-critical applications, such as for civil air transports, demands the use of rigorous formal mathematical verification procedures. This paper demonstrates how to apply formal methods to the development and verification of software by leading the reader step-by-step through requirements analysis, design, implementation, and verification of an electronic phone book application. The current maturity and limitations of formal methods tools and techniques are then discussed, and a number of examples of the successful use of formal methods by industry are cited.
Introducing Risk Management Techniques Within Project Based Software Engineering Courses
NASA Astrophysics Data System (ADS)
Port, Daniel; Boehm, Barry
2002-03-01
In 1996, USC switched its core two-semester software engineering course from a hypothetical-project, homework-and-exam course based on the Bloom taxonomy of educational objectives (knowledge, comprehension, application, analysis, synthesis, and evaluation) to a real-client team-project course based on the CRESST model of learning objectives (content understanding, problem solving, collaboration, communication, and self-regulation). We used the CRESST cognitive demands analysis to determine the student skills required for software risk management and the other major project activities, and have been refining the approach over the last 5 years of experience, including revised versions for one-semester undergraduate and graduate project courses at Columbia. This paper summarizes our experiences in evolving the risk management aspects of the project course. These have helped us mature more general techniques such as risk-driven specifications, domain-specific simplifier and complicator lists, and the schedule as an independent variable (SAIV) process model. The largely positive results in terms of pass/fail rates, client evaluations, product adoption rates, and hiring manager feedback are summarized as well.
Drawert, Brian; Engblom, Stefan; Hellander, Andreas
2012-06-22
Experiments in silico using stochastic reaction-diffusion models have emerged as an important tool in molecular systems biology. Designing computational software for such applications poses several challenges. Firstly, realistic lattice-based modeling for biological applications requires a consistent way of handling complex geometries, including curved inner and outer boundaries. Secondly, spatiotemporal stochastic simulations are computationally expensive due to the fast time scales of individual reaction and diffusion events when compared to the biological phenomena of actual interest. We therefore argue that simulation software needs to be computationally efficient, employing sophisticated algorithms, yet at the same time flexible in order to meet present and future needs of increasingly complex biological modeling. We have developed URDME, a flexible software framework for general stochastic reaction-transport modeling and simulation. URDME uses unstructured triangular and tetrahedral meshes to resolve general geometries, and relies on the Reaction-Diffusion Master Equation formalism to model the processes under study. An interface to mature external geometry- and mesh-handling software (Comsol Multiphysics) provides a stable and interactive environment for model construction. The core simulation routines are logically separated from the model building interface and written in a low-level language for computational efficiency. The connection to the geometry handling software is realized via a Matlab interface which facilitates script computing, data management, and post-processing. For practitioners, the software therefore behaves much as an interactive Matlab toolbox. At the same time, it is possible to modify and extend URDME with newly developed simulation routines.
Since the overall design effectively hides the complexity of managing the geometry and meshes, this means that newly developed methods may be tested in a realistic setting already at an early stage of development. In this paper we demonstrate, in a series of examples with high relevance to the molecular systems biology community, that the proposed software framework is a useful tool for both practitioners and developers of spatial stochastic simulation algorithms. Through the combined efforts of algorithm development and improved modeling accuracy, increasingly complex biological models become feasible to study through computational methods. URDME is freely available at http://www.urdme.org.
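The Reaction-Diffusion Master Equation that URDME solves treats diffusion as jump events between mesh voxels, simulated together with reactions by a Gillespie-type algorithm. The sketch below is a deliberately tiny illustration of that formalism on a three-voxel 1-D "mesh" with a single species that hops between neighbours and degrades; it is not URDME's API, and the function name and rate constants are assumptions:

```python
import random

def rdme_ssa(voxels, d, k_deg, t_end, seed=1):
    """Toy Gillespie (SSA) simulation of the reaction-diffusion master
    equation: each molecule hops to an adjacent voxel at rate d and
    degrades at rate k_deg. Returns the final copy numbers per voxel."""
    rng = random.Random(seed)
    n = list(voxels)
    t = 0.0
    while t < t_end:
        # build the propensity list: degradation per voxel, hop per edge
        events = []
        for i in range(len(n)):
            if n[i] > 0:
                events.append((k_deg * n[i], ("deg", i, i)))
                if i > 0:
                    events.append((d * n[i], ("hop", i, i - 1)))
                if i < len(n) - 1:
                    events.append((d * n[i], ("hop", i, i + 1)))
        a0 = sum(a for a, _ in events)
        if a0 == 0:
            break                      # system is empty: nothing left to fire
        t += rng.expovariate(a0)       # exponential waiting time to next event
        r = rng.uniform(0.0, a0)       # pick an event proportionally to rate
        for a, ev in events:
            r -= a
            if r <= 0:
                break
        kind, src, dst = ev
        n[src] -= 1
        if kind == "hop":
            n[dst] += 1
    return n

# 100 molecules start in the leftmost voxel, spread out, and slowly decay
final = rdme_ssa([100, 0, 0], d=1.0, k_deg=0.1, t_end=5.0)
```

URDME applies the same event-driven logic on unstructured tetrahedral meshes, with per-edge jump rates derived from the finite-element discretisation rather than a single constant d.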
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, C.
1997-11-01
For many years, software quality assurance lagged behind hardware quality assurance in terms of methods, metrics, and successful results. New approaches such as Quality Function Deployment (QFD), the ISO 9000-9004 standards, the SEI maturity levels, and Total Quality Management (TQM) are starting to attract wide attention, and in some cases to bring software quality levels up to parity with manufacturing quality levels. Since software is on the critical path for many engineered products, and for internal business systems as well, the new approaches are starting to affect global competition and attract widespread international interest. It can be hypothesized that success in mastering software quality will be a key strategy for dominating global software markets in the 21st century.
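One concrete metric in the spirit of this abstract, widely associated with the author's broader published work on software quality though not named here, is defect removal efficiency: the fraction of total defects found before release. A minimal sketch, with assumed example counts:

```python
def defect_removal_efficiency(pre_release, post_release):
    """DRE = defects removed before release / total defects found,
    where total = pre-release + post-release (field) defects."""
    total = pre_release + post_release
    return pre_release / total if total else 1.0

# e.g. 95 defects removed by reviews and tests, 5 later reported from the field
dre = defect_removal_efficiency(95, 5)
```

A DRE above roughly 95% is commonly cited as a hallmark of mature quality programs, whereas projects without systematic reviews tend to sit far lower.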
NASA Astrophysics Data System (ADS)
Fu, L.; West, P.; Zednik, S.; Fox, P. A.
2013-12-01
For simple portals such as vocabulary based services, which contain small amounts of data and require only hyper-textual representation, it is often an overkill to adopt the whole software stack of database, middleware and front end, or to use a general Web development framework as the starting point of development. Directly combining open source software is a much more favorable approach. However, our experience with the Coastal and Marine Spatial Planning Vocabulary (CMSPV) service portal shows that there are still issues such as system configuration and accommodating a new team member that need to be handled carefully. In this contribution, we share our experience in the context of the CMSPV portal, and focus on the tools and mechanisms we've developed to ease the configuration job and the incorporation process of new project members. We discuss the configuration issues that arise when we don't have complete control over how the software in use is configured and need to follow existing configuration styles that may not be well documented, especially when multiple pieces of such software need to work together as a combined system. As for the CMSPV portal, it is built on two pieces of open source software that are still under rapid development: a Fuseki data server and Epimorphics Linked Data API (ELDA) front end. Both lack mature documentation and tutorials. We developed comparison and labeling tools to ease the problem of system configuration. Another problem that slowed down the project is that project members came and went during the development process, so new members needed to start with a partially configured system and incomplete documentation left by old members. We developed documentation/tutorial maintenance mechanisms based on our comparison and labeling tools to make it easier for the new members to be incorporated into the project. 
These tools and mechanisms have also benefited other projects that reuse software components from the CMSPV system.
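A configuration comparison tool of the kind described can be as simple as a structured diff over flat key-value settings. The sketch below is a toy illustration; the function and the Fuseki-style keys are hypothetical, not the actual CMSPV tooling:

```python
def diff_config(old, new):
    """Structured diff of two flat key-value configurations.
    Returns (added, removed, changed) dictionaries, where changed maps
    a key to its (old_value, new_value) pair."""
    added = {k: new[k] for k in new if k not in old}
    removed = {k: old[k] for k in old if k not in new}
    changed = {k: (old[k], new[k])
               for k in old if k in new and old[k] != new[k]}
    return added, removed, changed

# hypothetical data-server settings before and after an upgrade
before = {"port": "3030", "dataset": "cmspv", "timeout": "600"}
after = {"port": "3030", "dataset": "cmspv-v2", "loglevel": "info"}
added, removed, changed = diff_config(before, after)
```

Labeling each reported difference with the software component it belongs to, as the abstract describes, then turns the raw diff into documentation a new project member can follow.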
Urschler, Martin; Grassegger, Sabine; Štern, Darko
2015-01-01
Age estimation of individuals is important in human biology and has various medical and forensic applications. Recent interest in MR-based methods aims to investigate alternatives to established methods involving ionising radiation. Automatic, software-based methods additionally promise improved estimation objectivity. Our aim was to investigate how informative automatically selected image features are in discriminating age, by exploring a recently proposed software-based age estimation method for MR images of the left hand and wrist. One hundred and two MR datasets of left hand images were used to evaluate age estimation performance, consisting of bone and epiphyseal gap volume localisation, computation of one age regression model per bone mapping image features to age, and fusion of the individual bone age predictions into a final age estimate. Quantitative results of the software-based method show an age estimation performance with a mean absolute difference of 0.85 years (SD = 0.58 years) from chronological age, as determined by a cross-validation experiment. Qualitatively, it is demonstrated how feature selection works and which image features of skeletal maturation are automatically chosen to model the non-linear regression function. The feasibility of automatic age estimation based on MRI data is shown, and the selected image features are found to be informative for describing anatomical changes during physical maturation in male adolescents.
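The cross-validation experiment reported above can be sketched in miniature: hold out each subject in turn, fit on the rest, and average the absolute differences from chronological age. The predictor below is a trivial mean-age baseline purely for illustration (the real method fits one regression model per bone), and the ages are invented:

```python
def loo_cv_mad(ages, predict):
    """Leave-one-out cross-validation: predict each subject's age from
    the remaining subjects and return the mean absolute difference
    from chronological age."""
    errors = []
    for i in range(len(ages)):
        training = ages[:i] + ages[i + 1:]   # all subjects except the i-th
        errors.append(abs(predict(training) - ages[i]))
    return sum(errors) / len(errors)

# hypothetical chronological ages; baseline predictor = mean of the training fold
ages = [13.0, 14.5, 15.0, 16.2, 17.1]
mad = loo_cv_mad(ages, lambda fold: sum(fold) / len(fold))
```

Any real feature-based regressor plugs into the same harness in place of the lambda, which is what makes the 0.85-year figure directly comparable across methods.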
NASA Technical Reports Server (NTRS)
Feather, M. S.
2002-01-01
Infusing IT technology is a perennial challenge. The Technology Infusion and Maturity Assessment approach of Cornford & Hicks is shown applied to an example of IT infusion: model-based V&V of spacecraft software.
Software organization for a prolog-based prototyping system for machine vision
NASA Astrophysics Data System (ADS)
Jones, Andrew C.; Hack, Ralf; Batchelor, Bruce G.
1996-11-01
We describe PIP (Prolog image processing), a prototype system for interactive image processing using Prolog, implemented on an Apple Macintosh computer. PIP is the latest in a series of systems, under the collective title Prolog+, whose implementation the third author has been involved in. PIP differs from our previous systems in two particularly important respects. The first is that whereas we previously required dedicated image processing hardware, the present system implements the image processing routines in software. The second is that our present system is hierarchical in structure: the top level of the hierarchy emulates Prolog+, but there is a flexible infrastructure which supports more sophisticated image manipulation that we will be able to exploit in due course. We discuss the impact of the Apple Macintosh operating system upon the implementation of the image processing functions, and the interface between these functions and the Prolog system. We also explain how the existing set of Prolog+ commands has been implemented. PIP is now nearing maturity, and we will make a version of it generally available in the near future. However, although the present version of PIP constitutes a complete image processing tool, there are a number of ways in which we intend to enhance future versions, with a view to greater flexibility and efficiency; we discuss these ideas briefly near the end of the present paper.
NASA Astrophysics Data System (ADS)
Cork, Chris; Lugg, Robert; Chacko, Manoj; Levi, Shimon
2005-06-01
With the exponential increase in output database size due to the aggressive optical proximity correction (OPC) and resolution enhancement techniques (RET) required for deep sub-wavelength process nodes, the CPU time required for mask tape-out continues to increase significantly. For integrated device manufacturers (IDMs), this can impact the time-to-market for their products, where even a few days' delay could have a huge commercial impact and loss of market window opportunity. For foundries, a shorter turnaround time provides a competitive advantage in their demanding market: being too slow could mean customers look elsewhere for these services, while a fast turnaround may even command a higher price. With fab turnaround for a mature, plain-vanilla CMOS process at around 20-30 days, a delay of several days in mask tape-out would contribute a significant fraction of the total time to deliver prototypes. Unlike silicon processing, mask tape-out time can be decreased by simply purchasing extra computing resources and software licenses. Mask tape-out groups are taking advantage of the ever-decreasing hardware cost and increasing power of commodity processors. The significant distributability inherent in some commercial mask synthesis software can be leveraged to address this critical business issue. Different implementations have different fractions of code that cannot be parallelized, and this affects the efficiency with which they scale, as described by Amdahl's law. Very few are efficient enough to allow the effective use of thousands of processors, enabling run times to drop from days to only minutes. What follows is a cost-aware methodology to quantify the scalability of this class of software, and thus act as a guide to estimating the optimal investment in hardware and software licenses.
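Amdahl's law, referred to above, bounds the achievable speedup by the serial fraction f of the code: S(N) = 1 / (f + (1 - f)/N). A minimal sketch; the 5% serial fraction is an assumed figure for illustration, not a measurement from the paper:

```python
def amdahl_speedup(serial_fraction, processors):
    """Amdahl's law: overall speedup on N processors when a fraction f
    of the work is inherently serial: S = 1 / (f + (1 - f) / N)."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / processors)

# with an assumed 5% serial fraction, even 1000 CPUs cannot exceed the
# asymptotic cap of 1/f = 20x, which is why the serial fraction, not the
# hardware budget, ultimately limits tape-out turnaround
cap = 1.0 / 0.05
s_1000 = amdahl_speedup(0.05, 1000)
```

A cost-aware analysis then weighs the marginal speedup of each added license and CPU against the commercial value of the days saved, and stops buying where the curve flattens.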
BarraCUDA - a fast short read sequence aligner using graphics processing units
2012-01-01
Background With the maturation of next-generation DNA sequencing (NGS) technologies, the throughput of DNA sequencing reads has soared to over 600 gigabases from a single instrument run. General-purpose computing on graphics processing units (GPGPU) extracts the computing power from hundreds of parallel stream processors within graphics processing cores and provides a cost-effective and energy-efficient alternative to traditional high-performance computing (HPC) clusters. In this article, we describe the implementation of BarraCUDA, a GPGPU sequence alignment software based on BWA, to accelerate the alignment of sequencing reads generated by these instruments to a reference DNA sequence. Findings Using the NVIDIA Compute Unified Device Architecture (CUDA) software development environment, we ported the most computationally intensive alignment component of BWA to the GPU to take advantage of its massive parallelism. As a result, BarraCUDA offers an order-of-magnitude performance boost in alignment throughput when compared to a CPU core, while delivering the same level of alignment fidelity. The software is also capable of supporting multiple CUDA devices in parallel to further accelerate alignment throughput. Conclusions BarraCUDA is designed to take advantage of the parallelism of GPUs to accelerate the alignment of millions of sequencing reads generated by NGS instruments. By doing this, we could, at least in part, streamline the current bioinformatics pipeline such that the wider scientific community could benefit from the sequencing technology. BarraCUDA is currently available from http://seqbarracuda.sf.net PMID:22244497
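BWA, on which BarraCUDA is based, aligns reads by backward search over an FM-index built from the Burrows-Wheeler transform of the reference. The toy sketch below illustrates the exact-match core of that technique on a short reference; it is a didactic CPU version with naive occurrence counting, not BarraCUDA's CUDA implementation, and the tiny reference string is invented:

```python
def build_fm(text):
    """Build a toy FM-index: append the '$' terminator, form the
    Burrows-Wheeler transform (last column of sorted rotations via the
    suffix array), and the C table of counts of smaller characters."""
    text = text + "$"
    sa = sorted(range(len(text)), key=lambda i: text[i:])
    bwt = "".join(text[i - 1] for i in sa)
    counts = {}
    for c in text:
        counts[c] = counts.get(c, 0) + 1
    C, total = {}, 0
    for c in sorted(counts):
        C[c] = total
        total += counts[c]
    return bwt, C

def count_occurrences(bwt, C, pattern):
    """Backward search: narrow the suffix-array interval one pattern
    character at a time, right to left; the interval width is the
    number of exact occurrences in the indexed text."""
    lo, hi = 0, len(bwt)
    for c in reversed(pattern):
        if c not in C:
            return 0
        lo = C[c] + bwt[:lo].count(c)   # rank queries; real FM-indexes
        hi = C[c] + bwt[:hi].count(c)   # precompute these in O(1) tables
        if lo >= hi:
            return 0
    return hi - lo

bwt, C = build_fm("ACGTACGT")
```

Each read's backward search is independent of every other read's, which is exactly the data parallelism BarraCUDA maps onto thousands of GPU threads.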
Dynamics of miRNA biogenesis and nuclear transport.
Kotipalli, Aneesh; Gutti, Ravikumar; Mitra, Chanchal K
2016-12-01
MicroRNAs (miRNAs) are short noncoding RNA sequences, ~22 nucleotides in length, that play an important role in gene regulation (transcription and translation). The processing of these miRNAs takes place in both the nucleus and the cytoplasm, while the final maturation occurs in the cytoplasm. Some mature miRNAs with nuclear localisation signals (NLS) are transported back to the nucleus, and some remain in the cytoplasm. The functional roles of these miRNAs are seen in both the nucleus and the cytoplasm. In the nucleus, miRNAs regulate gene expression by binding to targeted promoter sequences and effect either transcriptional gene silencing (TGS) or transcriptional gene activation (TGA). In the cytoplasm, targeted mRNAs are translationally repressed or cleaved based on the complementarity between the two sequences at the seed region of the miRNA and mRNA. The selective transport of mature miRNAs to the nucleus follows the classical nuclear import mechanism, a highly regulated process involving exportins and importins. The nuclear pore complex (NPC) regulates all these transport events like a gatekeeper. The half-life of miRNAs is rather low, so miRNAs perform their function within a short time; temporal studies of miRNA biogenesis are therefore useful. We have carried out simulation studies of the important miRNA biogenesis steps and of the classical nuclear import mechanism using an ordinary differential equation (ODE) solver in the Octave software.
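The ODE simulations described (done with an Octave solver) can be illustrated with a deliberately simplified linear biogenesis chain integrated by forward Euler; all rate constants below are assumed values for illustration, not parameters from the study:

```python
def simulate_mirna(t_end=100.0, dt=0.01):
    """Forward-Euler integration of a toy linear biogenesis chain:
    transcription -> pri-miRNA -(Drosha)-> pre-miRNA -(export + Dicer)->
    mature miRNA, with first-order decay of the short-lived mature species."""
    k_tx, k1, k2, k_deg = 1.0, 0.5, 0.3, 0.2   # assumed rate constants
    pri = pre = mat = 0.0
    for _ in range(int(t_end / dt)):
        dpri = k_tx - k1 * pri          # production minus Drosha cleavage
        dpre = k1 * pri - k2 * pre      # cleavage product minus export/Dicer
        dmat = k2 * pre - k_deg * mat   # maturation minus decay
        pri += dt * dpri
        pre += dt * dpre
        mat += dt * dmat
    return pri, pre, mat

pri, pre, mat = simulate_mirna()
```

For this linear chain the steady states are simply k_tx/k1, k_tx/k2 and k_tx/k_deg; an adaptive solver such as Octave's lsode handles the same system, and stiffer nonlinear extensions, more robustly than fixed-step Euler.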
Space Missions: Long Term Preservation of IDL-based Software using GDL
NASA Astrophysics Data System (ADS)
Coulais, A.; Schellens, M.; Arabas, S.; Lenoir, M.; Noreskal, L.; Erard, S.
2012-09-01
GNU Data Language (GDL) is a free software clone of IDL, an interactive language widely used in Astronomy and space missions for decades. Proprietary status, license restrictions, price, sustainability and continuity of support for particular platforms are recurrent concerns in the Astronomy community, especially concerning space missions, which require long-term support. In this paper, we describe the key features of GDL and the main achievements from recent development work. We illustrate the maturity of GDL by presenting two examples of application: reading spectral cubes in PDS format and use of the HEALPix library. These examples support the main argument of the paper: that GDL has reached a level of maturity and usability ensuring long-term preservation of analysis capabilities for numerous ground experiments and space missions based on IDL.
2012-11-01
[Fragmentary record; recoverable keywords: Architecture Tradeoff Analysis Method (ATAM), Capability Maturity Model, Capability Maturity Modeling, Carnegie Mellon, CERT, CERT Coordination Center, CMM, CMMI. Recoverable text notes challenges related to delivering software features rapidly and/or incrementally.]
Indico central - events organisation, ergonomics and collaboration tools integration
NASA Astrophysics Data System (ADS)
Benito Gonzélez López, José; Ferreira, José Pedro; Baron, Thomas
2010-04-01
While the remote collaboration services at CERN slowly aggregate around the Indico event management software, its new version, the result of a careful maturation process, includes improvements that will set a new reference in its domain. The presentation will focus on a description of the tool's new features and on the user feedback process, which resulted in a new record of usability. We will also describe the interactions with the worldwide community of users and server administrators and the impact this has had on our development process, as well as the tools put in place to streamline the work between the different collaborating sites. A last part will be dedicated to the use of Indico as a central hub for operating other local services around event organisation (registration, e-payment, audiovisual recording, webcast, room booking, and videoconference support).
Cervical vertebral and dental maturity in Turkish subjects.
Başaran, Güvenç; Ozer, Törün; Hamamci, Nihal
2007-04-01
The aim of this study was to investigate the relationships between the stages of calcification of the teeth and the cervical vertebral maturity stages in Turkish subjects. A retrospective cross-sectional study was designed. The final study population consisted of 590 Turkish subjects. Statistical analysis of the data was performed with computer software. Spearman rank order correlation coefficients were used to assess the relationship between cervical vertebral and dental maturation. For a better understanding of the relationship between cervical vertebral maturation indexes and dental age, percentage distributions of the studied teeth were also calculated. Strong correlations were found between the dental and cervical vertebral maturation of Turkish subjects. For males, the sequence from lowest to highest was third molar, central incisor, canine, first premolar, second premolar, first molar, and second molar. For females, the sequence from lowest to highest was third molar, canine, second premolar, first premolar, central incisor, first molar, and second molar. Dental maturation stages can be used as a reliable indicator of facial growth.
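The Spearman rank order correlation used in the study is the Pearson correlation of the ranked data, with average ranks assigned to ties. A self-contained sketch, using invented maturation-stage data (not the study's measurements):

```python
def spearman_rho(x, y):
    """Spearman's rank correlation: Pearson correlation computed on the
    ranks of the data, using average ranks for tied values."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        i = 0
        while i < len(order):
            j = i
            while j + 1 < len(order) and v[order[j + 1]] == v[order[i]]:
                j += 1                          # extend over the tie group
            avg_rank = (i + j) / 2.0 + 1.0      # average 1-based rank of group
            for k in range(i, j + 1):
                r[order[k]] = avg_rank
            i = j + 1
        return r

    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) *
           sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# hypothetical paired maturation stages (cervical vertebral vs. dental),
# including one tied pair; a perfectly monotone relationship gives rho = 1
cvm_stage = [1, 2, 2, 3, 4, 5, 6]
dental_stage = [2, 3, 3, 4, 5, 6, 7]
rho = spearman_rho(cvm_stage, dental_stage)
```

Because stage assessments are ordinal rather than interval-scaled, a rank correlation such as Spearman's is the appropriate choice here; a Pearson correlation on the raw stage codes would assume equal spacing between stages.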
Towards a mature measurement environment: Creating a software engineering research environment
NASA Technical Reports Server (NTRS)
Basili, Victor R.
1990-01-01
Software engineering researchers are building tools and defining methods and models; however, there are problems with the nature and style of the research. The research is typically bottom-up, done in isolation, so the pieces cannot easily be integrated, logically or physically. A great deal of the research is essentially the packaging of a particular piece of technology, with little indication of how the work would be integrated with other pieces of research. The research is not aimed at solving the real problems of software engineering, i.e., the development and maintenance of quality systems in a productive manner. The research results are not evaluated or analyzed via experimentation, or refined and tailored to the application environment; thus, they cannot easily be transferred into practice. Because of these limitations, we have not been able to understand the components of the discipline as a coherent whole, or the relationships between various models of the process and product. What is needed is a top-down, experimental, evolutionary framework in which research can be focused, logically and physically integrated to produce quality software productively, and evaluated and tailored to the application environment. This implies the need for experimentation, which in turn implies the need for a laboratory associated with the artifact we are studying. This laboratory can only exist in an environment where software is being built, i.e., as part of a real software development and maintenance organization. Thus, we propose that Software Engineering Laboratory (SEL) type activities exist in all organizations to support software engineering research. We describe the SEL from a researcher's point of view, and discuss the corporate and government benefits of the SEL. The discussion focuses on the benefits to the research community.
Appel, R D; Palagi, P M; Walther, D; Vargas, J R; Sanchez, J C; Ravier, F; Pasquali, C; Hochstrasser, D F
1997-12-01
Although two-dimensional electrophoresis (2-DE) computer analysis software packages have existed ever since 2-DE technology was developed, it is only now that the hardware and software technology allows large-scale studies to be performed on low-cost personal computers or workstations, and that setting up a 2-DE computer analysis system in a small laboratory is no longer considered a luxury. After a first attempt in the seventies and early eighties to develop 2-DE analysis software systems on hardware that had poor or even no graphical capabilities, followed in the late eighties by a wave of innovative software developments made possible by new graphical interface standards such as X Windows, a third generation of 2-DE analysis software packages has now come to maturity. It can be run on a variety of low-cost, general-purpose personal computers, thus making the purchase of a 2-DE analysis system easily attainable for even the smallest laboratory that is involved in proteome research. Melanie II 2-D PAGE, developed at the University Hospital of Geneva, is such a third-generation software system for 2-DE analysis. Based on unique image processing algorithms, this user-friendly object-oriented software package runs on multiple platforms, including Unix, MS-Windows 95 and NT, and Power Macintosh. It provides efficient spot detection and quantitation, state-of-the-art image comparison, and statistical data analysis facilities, and is Internet-ready. Linked to proteome databases such as those available on the World Wide Web, it represents a valuable tool for the "Virtual Lab" of the post-genome era.
Customer Communication Challenges and Solutions in Globally Distributed Agile Software Development
NASA Astrophysics Data System (ADS)
Pikkarainen, Minna; Korkala, Mikko
Working in the globally distributed market is one of the key trends among software organizations all over the world [1-5]. Several factors have contributed to the growth of distributed software development: time-zone-independent "follow the sun" development, access to well-educated labour, maturation of the technical infrastructure, and reduced costs are some of the most commonly cited benefits of distributed development [3, 6-8]. Furthermore, customers are often located in different countries because of companies' internationalization strategies or good market opportunities.
Results of a Formal Methods Demonstration Project
NASA Technical Reports Server (NTRS)
Kelly, J.; Covington, R.; Hamilton, D.
1994-01-01
This paper describes the results of a cooperative study conducted by a team of researchers in formal methods at three NASA Centers to demonstrate FM techniques and to tailor them to critical NASA software systems. This pilot project applied FM to an existing critical software subsystem, the Shuttle's Jet Select subsystem (Phase I of an ongoing study). The present study shows that FM can be used successfully to uncover hidden issues in a highly critical and mature Functional Subsystem Software Requirements (FSSR) specification which are very difficult to discover by traditional means.
Liu, Jin-Na; Xie, Xiao-Liang; Yang, Tai-Xin; Zhang, Cun-Li; Jia, Dong-Sheng; Liu, Ming; Wen, Chun-Xiu
2014-04-01
To study the effect of different maturity stages and processing methods on the quality of Trichosanthes kirilowii seeds, the content of 3,29-dibenzoyl rarounitriol in the seeds was determined by HPLC. Samples at different maturity stages (immature, near mature, and fully mature) and processed by different methods were studied. Fully mature Trichosanthes kirilowii seeds were better than immature ones, and the best processing method was drying at 60 °C, at which the content of 3,29-dibenzoyl rarounitriol reached 131.63 μg/mL. Both processing method and maturity stage had a significant influence on the quality of Trichosanthes kirilowii seeds.
A qualitative assessment of Toxoplasma gondii risk in ready-to-eat smallgoods processing.
Mie, Tanya; Pointon, Andrew M; Hamilton, David R; Kiermeier, Andreas
2008-07-01
Toxoplasma gondii is one of the most common parasitic infections of humans and other warm-blooded animals. In most adults, it does not cause serious illness, but severe disease may result from infection in fetuses and immunocompromised people. Consumption of raw or undercooked meats has consistently been identified as an important source of exposure to T. gondii. Several studies indicate the potential failure to inactivate T. gondii during the processing of cured meat products. This article presents a qualitative risk-based assessment of the processing of ready-to-eat smallgoods, which include cooked or uncooked fermented meat, pâté, dried meat, slow cured meat, luncheon meat, and cooked muscle meat including ham and roast beef. The raw meat ingredients are rated with respect to their likelihood of containing T. gondii cysts, and an adjustment is made based on whether all the meat from a particular source is frozen. Next, the effectiveness of common processing steps to inactivate T. gondii cysts is assessed, including addition of spices, nitrates, nitrites and salt, use of fermentation, smoking and heat treatment, and the time and temperature during maturation. It is concluded that processing steps that may be effective in the inactivation of T. gondii cysts include freezing, heat treatment, and cooking, and the interaction between salt concentration, maturation time, and temperature. The assessment is illustrated using a Microsoft Excel-based software tool that was developed to facilitate the easy assessment of four hypothetical smallgoods products.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tugurlan, Maria C.; Kirkham, Harold; Chassin, David P.
Budget and schedule overruns in product development due to the use of immature technologies constitute an important matter for program managers. Moreover, an unexpected lack of technology maturity is also a problem for buyers. Both sides of the situation would benefit from an unbiased measure of technology maturity. This paper presents the use of a maturity metric called Technology Readiness Level (TRL) in the milieu of the smart grid. For most of the time they have been in existence, power utilities have been protected monopolies, guaranteed a return on investment on anything they could justify adding to the rate base. Such a situation did not encourage innovation, and instead led to widespread risk-avoidance behavior in many utilities. The situation changed at the end of the last century, with a series of regulatory measures beginning with the Public Utility Regulatory Policy Act of 1978. However, some bad experiences have actually served to strengthen the resistance to innovation by some utilities. Some aspects of the smart grid, such as the addition of computer-based control to the power system, face an uphill battle. It is our position that the addition of TRLs to the decision-making process for smart grid power-system projects will lead to an environment of more confident adoption.
NASA Software Engineering Benchmarking Effort
NASA Technical Reports Server (NTRS)
Godfrey, Sally; Rarick, Heather
2012-01-01
Benchmarking was very interesting and provided a wealth of information: (1) we saw potential solutions to some of our "top 10" issues, and (2) we obtained an assessment of where NASA stands in relation to other aerospace/defense groups. We formed new contacts and potential collaborations: (1) several organizations sent us examples of their templates and processes, and (2) many of the organizations were interested in future collaboration, such as sharing of training, metrics, Capability Maturity Model Integration (CMMI) appraisers, and instructors. We received feedback from some of our contractors/partners: (1) they expressed a desire to participate in our training and to provide feedback on procedures, and (2) they welcomed the opportunity to provide feedback on working with NASA.
A new approach for instrument software at Gemini
NASA Astrophysics Data System (ADS)
Gillies, Kim; Nunez, Arturo; Dunn, Jennifer
2008-07-01
Gemini Observatory is now developing its next generation of astronomical instruments, the Aspen instruments. These new instruments are sophisticated and costly, requiring large, distributed, collaborative teams. Instrument software groups often include experienced team members with existing mature code. Gemini has drawn on its experience from the previous generation of instruments, together with current hardware and software technology, to create an approach for developing instrument software that takes advantage of the strengths of our instrument builders and meets our own operations needs. This paper describes this new software approach, which couples a lightweight infrastructure and software library with aspects of modern agile software development. The Gemini Planet Imager instrument project, which is currently approaching its critical design review, is used to demonstrate aspects of this approach. New facilities under development will face similar issues in the future, and the approach presented here can be applied to other projects.
Warning: Projects May Be Closer than They Appear
NASA Technical Reports Server (NTRS)
Africa, Colby
2004-01-01
I had been working for two years as the technical product manager for a large software company, when their partner company gave me a call. They needed good software engineers to customize a new version of software, and they thought I was their guy. They told me what they wanted to do to the software, and they even showed me some prototypes. Their idea was to take the basic software tool that the large company was producing and make it more accessible to the customer. They would do this by building in flexibility based on user skill level and organizational maturity. I thought that was a fascinating approach, and I bought into it in a big way. I decided to leave my job and join up with the smaller company as their director of software engineering.
Your Personal Analysis Toolkit - An Open Source Solution
NASA Astrophysics Data System (ADS)
Mitchell, T.
2009-12-01
Open source software is commonly known for its web browsers, word processors and programming languages. However, there is a vast array of open source software focused on geographic information management and geospatial application building in general. As geo-professionals, having easy access to tools for our jobs is crucial. Open source software provides the opportunity to add a tool to your tool belt and carry it with you for your entire career - with no license fees, a supportive community and the opportunity to test, adopt and upgrade at your own pace. OSGeo is a US registered non-profit representing more than a dozen mature geospatial data management applications and programming resources. Tools cover areas such as desktop GIS, web-based mapping frameworks, metadata cataloging, spatial database analysis, image processing and more. Learn about some of these tools as they apply to AGU members, as well as how you can join OSGeo and its members in getting the job done with powerful open source tools. If you haven't heard of OSSIM, MapServer, OpenLayers, PostGIS, GRASS GIS or the many other projects under our umbrella - then you need to hear this talk. Invest in yourself - use open source!
Derbidge, Renatus; Feiten, Linus; Conradt, Oliver; Heusser, Peter; Baumgartner, Stephan
2013-01-01
Photographs of mistletoe (Viscum album L.) berries taken by a permanently fixed camera during their development in autumn were subjected to an outline shape analysis by fitting path curves using a mathematical algorithm from projective geometry. During growth and maturation processes the shape of mistletoe berries can be described by a set of such path curves, making it possible to extract changes of shape using one parameter called Lambda. Lambda describes the outline shape of a path curve. Here we present methods and software to capture and measure these changes of form over time. The present paper describes the software used to automate a number of tasks, including contour recognition, optimization of the contour fit via hill-climbing, derivation of the path curves, computation of Lambda, and blinding the pictures for the operator. The validity of the program is demonstrated by results from three independent measurements showing circadian rhythm in mistletoe berries. The program is available as open source and will be applied in a project to analyze the chronobiology of shape in mistletoe berries and the buds of their host trees. PMID:23565255
A genetic algorithm-based job scheduling model for big data analytics.
Lu, Qinghua; Li, Shanshan; Zhang, Weishan; Zhang, Lei
Big data analytics (BDA) applications are a new category of software applications that process large amounts of data using scalable parallel processing infrastructure to obtain hidden value. Hadoop is the most mature open-source big data analytics framework; it implements the MapReduce programming model to process big data with MapReduce jobs. Big data analytics jobs are often continuous and not mutually separated. Existing work mainly focuses on executing jobs in sequence, which is often inefficient and consumes high energy. In this paper, we propose a genetic algorithm-based job scheduling model for big data analytics applications to improve the efficiency of big data analytics. To implement the job scheduling model, we leverage an estimation module to predict the performance of clusters when executing analytics jobs. We have evaluated the proposed job scheduling model in terms of feasibility and accuracy.
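The scheduling idea in the abstract can be illustrated with a toy genetic algorithm: candidate solutions are job orderings, fitness is the makespan predicted by a stand-in estimation module, and selection plus swap mutation evolves better orderings. The job names, costs, and two-slot cluster below are invented for demonstration; the paper's actual scheduling model and estimation module are not reproduced here.

```python
import random

# Hypothetical job runtimes (seconds), standing in for the paper's
# estimation module that predicts cluster performance per job.
JOB_COST = {"j1": 40, "j2": 10, "j3": 25, "j4": 5}
SLOTS = 2  # assumed number of parallel execution slots on the cluster

def makespan(order, slots=SLOTS):
    """Greedy list scheduling: assign each job, in the given order,
    to the least-loaded slot; fitness is the resulting makespan."""
    loads = [0.0] * slots
    for job in order:
        i = loads.index(min(loads))
        loads[i] += JOB_COST[job]
    return max(loads)

def evolve(jobs, generations=200, pop_size=30, seed=0):
    """Evolve job orderings: keep the best half, mutate by swapping
    two positions, and return the best ordering found."""
    rng = random.Random(seed)
    pop = [rng.sample(jobs, len(jobs)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=makespan)
        survivors = pop[: pop_size // 2]
        children = []
        for parent in survivors:
            child = parent[:]
            a, b = rng.sample(range(len(child)), 2)  # swap mutation
            child[a], child[b] = child[b], child[a]
            children.append(child)
        pop = survivors + children
    return min(pop, key=makespan)

best_order = evolve(list(JOB_COST))
```

With these invented costs the optimal two-slot makespan is 40 (j1 alone on one slot; j2+j3+j4 on the other), which the search reaches easily for a problem this small.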
Packet telemetry and packet telecommand - The new generation of spacecraft data handling techniques
NASA Technical Reports Server (NTRS)
Hooke, A. J.
1983-01-01
Because of the rising costs and reduced reliability associated with customized spacecraft and ground network hardware and software, the standardized Packet Telemetry and Packet Telecommand concepts are emerging as viable alternatives. Within each concept, autonomous packets of data created by ground and space application processes using standard formatting techniques are switched end-to-end through the space data network to their destination application processes using standard transfer protocols. Because the intermediate data networks can be designed to be completely mission-independent, this approach facilitates a high degree of automation and interoperability. The goals of the Consultative Committee for Space Data Systems are the adoption of an international guideline for future space telemetry formatting based on the Packet Telemetry concept, and the advancement of the NASA-ESA Working Group's Packet Telecommand concept to a level of maturity parallel to that of Packet Telemetry. Both the Packet Telemetry and Packet Telecommand concepts are reviewed.
CrossTalk: The Journal of Defense Software Engineering. Volume 20, Number 3, March 2007
2007-03-01
Capability Maturity Model® Integration (CMMI®). CMU Software Engineering Institute <www.sei.cmu.edu/cmmi>. 5. ISO/IEC 27001:2005. Information Security... international standards bodies – the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC) – are working on a number of projects that affect software security: • The ISO Technical Management Board (TMB) performs strategic planning and coordination for ISO
Structural Design of Ares V Interstage Composite Structure
NASA Technical Reports Server (NTRS)
Sleigh, David W.; Sreekantamurthy, Thammaiah; Kosareo, Daniel N.; Martin, Robert A.; Johnson, Theodore F.
2011-01-01
Preliminary and detailed design studies were performed to mature composite structural design concepts for the Ares V Interstage structure as a part of NASA's Advanced Composite Technologies Project. Aluminum honeycomb sandwich and hat-stiffened composite panel structural concepts were considered. The structural design and analysis studies were performed using HyperSizer design sizing software and MSC Nastran finite element analysis software. System-level design trade studies were carried out to predict weight and margins of safety for composite honeycomb-core sandwich and composite hat-stiffened skin design concepts. Details of both preliminary and detailed design studies are presented in the paper. For the range of loads and geometry considered in this work, the hat-stiffened designs were found to be approximately 11-16 percent lighter than the sandwich designs. A down-select process was used to choose the most favorable structural concept based on a set of figures of merit, and the honeycomb sandwich design was selected as the best concept based on advantages in manufacturing cost.
Unidata Cyberinfrastructure in the Cloud
NASA Astrophysics Data System (ADS)
Ramamurthy, M. K.; Young, J. W.
2016-12-01
Data services, software, and user support are critical components of geosciences cyberinfrastructure that help researchers advance science. With the maturity of and significant advances in cloud computing, it has recently emerged as a new paradigm for developing and delivering a broad array of services over the Internet. Cloud computing is now mature enough in usability in many areas of science and education, bringing the benefits of virtualized and elastic remote services to infrastructure, software, computation, and data. Cloud environments reduce the amount of time and money spent to procure, install, and maintain new hardware and software, and reduce costs through resource pooling and shared infrastructure. Given the enormous potential of cloud-based services, Unidata has been moving to align its software, services, and data delivery mechanisms with the cloud-computing paradigm. To realize the above vision, Unidata has worked toward: * Providing access to many types of data from a cloud (e.g., via the THREDDS Data Server, RAMADDA and EDEX servers); * Deploying data-proximate tools to easily process, analyze, and visualize those data in a cloud environment for consumption by anyone, on any device, from anywhere, at any time; * Developing and providing a range of pre-configured and well-integrated tools and services that can be deployed by any university in their own private or public cloud settings. Specifically, Unidata has adopted Docker to containerize applications, making them easy to deploy. Docker helps to create "disposable" installs and eliminates many configuration challenges.
Containerized applications include tools for data transport, access, analysis, and visualization: THREDDS Data Server, Integrated Data Viewer, GEMPAK, Local Data Manager, RAMADDA Data Server, and Python tools; * Leveraging Jupyter as a central platform and hub with its powerful set of interlinking tools to connect interactively data servers, Python scientific libraries, scripts, and workflows; * Exploring end-to-end modeling and prediction capabilities in the cloud; * Partnering with NOAA and public cloud vendors (e.g., Amazon and OCC) on the NOAA Big Data Project to harness their capabilities and resources for the benefit of the academic community.
NASA Astrophysics Data System (ADS)
Freire, A. S.; Pinheiro, M. A. A.; Karam-Silva, H.; Teschima, M. M.
2011-09-01
Eleven expeditions were undertaken to the Saint Peter and Saint Paul Archipelago to study the reproductive biology of Grapsus grapsus, providing additional information on limb mutilation and carapace colour. MATURE software was used to estimate morphological maturity, while gonadal analyses were conducted to estimate physiological maturity. The puberty moult took place at larger size in males (51.4 mm of carapace length) than in females (33.8 mm), while physiological maturity occurred at a similar size in males (38.4 mm) and in females (33.4 mm). Above 50 mm, the proportion of red males increased in the population, indicating that functional maturity is also related to colour pattern. Small habitat and high local population density contributed to the high rate of cannibalism. The low diversity of food items, absence of predators of large crabs and high geographic isolation are the determinants of unique behavioural and biological characteristics observed in the G. grapsus population.
ScaffoldSeq: Software for characterization of directed evolution populations.
Woldring, Daniel R; Holec, Patrick V; Hackel, Benjamin J
2016-07-01
ScaffoldSeq is software designed for the numerous applications, including directed evolution analysis, in which a user generates a population of DNA sequences encoding partially diverse proteins with related functions and would like to characterize the single-site and pairwise amino acid frequencies across the population. A common scenario for enzyme maturation, antibody screening, and alternative scaffold engineering involves naïve and evolved populations that contain diversified regions, varying in both sequence and length, within a conserved framework. Analyzing the diversified regions of such populations is facilitated by high-throughput sequencing platforms; however, length variability within these regions (e.g., antibody CDRs) encumbers the alignment process. To overcome this challenge, the ScaffoldSeq algorithm takes advantage of conserved framework sequences to quickly identify diverse regions. Beyond this, unintended biases in sequence frequency are generated throughout the experimental workflow required to evolve and isolate clones of interest prior to DNA sequencing. ScaffoldSeq software uniquely handles this issue by providing tools to quantify and remove background sequences, cluster similar protein families, and dampen the impact of dominant clones. The software produces graphical and tabular summaries for each region of interest, allowing users to evaluate diversity in a site-specific manner as well as identify epistatic pairwise interactions. The code and detailed information are freely available at http://research.cems.umn.edu/hackel. Proteins 2016; 84:869-874. © 2016 Wiley Periodicals, Inc.
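The framework-anchored extraction that the ScaffoldSeq algorithm relies on can be sketched minimally (this is not the actual ScaffoldSeq code): locate conserved flanking sequences in each read, take whatever lies between them as the diversified region regardless of its length, then tally site-wise residue frequencies within a length class. The toy reads and flank strings are invented for demonstration.

```python
from collections import Counter

def extract_region(read, left_flank, right_flank):
    """Return the diversified segment between two conserved framework
    flanks, or None if either flank is absent from the read."""
    start = read.find(left_flank)
    if start == -1:
        return None
    start += len(left_flank)
    end = read.find(right_flank, start)
    if end == -1:
        return None
    return read[start:end]

def site_frequencies(regions):
    """Per-position residue counts for regions of equal length."""
    counts = [Counter() for _ in range(len(regions[0]))]
    for r in regions:
        for i, aa in enumerate(r):
            counts[i][aa] += 1
    return counts

# Invented reads: "FWK1"/"FWK2" stand in for conserved framework
# sequences flanking a diversified three-residue loop.
reads = ["FWK1AYDFWK2", "FWK1AWDFWK2", "FWK1GYDFWK2"]
regions = [extract_region(r, "FWK1", "FWK2") for r in reads]
```

Because the region is defined by its flanks rather than by a global alignment, loops of different lengths in other reads would be extracted just as directly, which is the point of the framework-anchored approach.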
Proteomics Quality Control: Quality Control Software for MaxQuant Results.
Bielow, Chris; Mastrobuoni, Guido; Kempa, Stefan
2016-03-04
Mass spectrometry-based proteomics coupled to liquid chromatography has matured into an automatized, high-throughput technology, producing data on the scale of multiple gigabytes per instrument per day. Consequently, an automated quality control (QC) and quality analysis (QA) capable of detecting measurement bias, verifying consistency, and avoiding propagation of error is paramount for instrument operators and scientists in charge of downstream analysis. We have developed an R-based QC pipeline called Proteomics Quality Control (PTXQC) for bottom-up LC-MS data generated by the MaxQuant software pipeline. PTXQC creates a QC report containing a comprehensive and powerful set of QC metrics, augmented with automated scoring functions. The automated scores are collated to create an overview heatmap at the beginning of the report, giving valuable guidance also to nonspecialists. Our software supports a wide range of experimental designs, including stable isotope labeling by amino acids in cell culture (SILAC), tandem mass tags (TMT), and label-free data. Furthermore, we introduce new metrics to score MaxQuant's Match-between-runs (MBR) functionality, by which peptide identifications can be transferred across Raw files based on accurate retention time and m/z. Last but not least, PTXQC is easy to install and use and represents the first QC software capable of processing MaxQuant result tables. PTXQC is freely available at https://github.com/cbielow/PTXQC.
Hatoum-Aslan, Asma; Maniv, Inbal; Marraffini, Luciano A
2011-12-27
Precise RNA processing is fundamental to all small RNA-mediated interference pathways. In prokaryotes, clustered, regularly interspaced, short palindromic repeats (CRISPR) loci encode small CRISPR RNAs (crRNAs) that protect against invasive genetic elements by antisense targeting. CRISPR loci are transcribed as a long precursor that is cleaved within repeat sequences by CRISPR-associated (Cas) proteins. In many organisms, this primary processing generates crRNA intermediates that are subject to additional nucleolytic trimming to render mature crRNAs of specific lengths. The molecular mechanisms underlying this maturation event remain poorly understood. Here, we defined the genetic requirements for crRNA primary processing and maturation in Staphylococcus epidermidis. We show that changes in the position of the primary processing site result in extended or diminished maturation to generate mature crRNAs of constant length. These results indicate that crRNA maturation occurs by a ruler mechanism anchored at the primary processing site. We also show that maturation is mediated by specific cas genes distinct from those genes involved in primary processing, showing that this event is directed by CRISPR/Cas loci.
Smart Grid Interoperability Maturity Model Beta Version
DOE Office of Scientific and Technical Information (OSTI.GOV)
Widergren, Steven E.; Drummond, R.; Giroti, Tony
The GridWise Architecture Council was formed by the U.S. Department of Energy to promote and enable interoperability among the many entities that interact with the electric power system. This balanced team of industry representatives proposes principles for the development of interoperability concepts and standards. The Council provides industry guidance and tools that make it an available resource for smart grid implementations. In the spirit of advancing interoperability of an ecosystem of smart grid devices and systems, this document presents a model for evaluating the maturity of the artifacts and processes that specify the agreement of parties to collaborate across an information exchange interface. You are expected to have a solid understanding of large, complex system integration concepts and experience in dealing with software component interoperation. Those without this technical background should read the Executive Summary for a description of the purpose and contents of the document. Other documents, such as checklists, guides, and whitepapers, exist for targeted purposes and audiences. Please see the www.gridwiseac.org website for more products of the Council that may be of interest to you.
Design of monitoring system for mail-sorting based on the Profibus S7 series PLC
NASA Astrophysics Data System (ADS)
Zhang, W.; Jia, S. H.; Wang, Y. H.; Liu, H.; Tang, G. C.
2017-01-01
With the rapid development of postal express services, the workload of mail sorting is increasing, but automatic mail-sorting technology is not yet mature. In view of this, the system uses a Siemens S7-300 PLC as the master station controller and Siemens S7-200/400 PLCs as slave station controllers, and achieves monitoring of mail classification and sorting through the MCGS human-machine interface configuration software, PROFIBUS-DP communication, RFID technology, and a mechanical sorting manipulator. Mail is identified for sorting by scanning the RFID electronic bar code (fixed code) attached to each mail item; the corresponding controller processes the acquired information and transmits it to the sorting manipulator via PROFIBUS-DP. The system can realize accurate and efficient mail sorting, which will promote the development of mail-sorting technology.
Rebuilding the space technology base
NASA Technical Reports Server (NTRS)
Povinelli, Frederick P.; Stephenson, Frank W.; Sokoloski, Martin M.; Montemerlo, Melvin D.; Venneri, Samuel L.; Mulville, Daniel R.; Hirschbein, Murray S.; Smith, Paul H.; Schnyer, A. Dan; Lum, Henry
1989-01-01
NASA's Civil Space Technology Initiative (CSTI) will not only develop novel technologies for space exploration and exploitation, but also take mature technologies into their demonstration phase in earth orbit. In the course of five years, CSTI will pay off in ground- and space-tested hardware, software, processes, methods for low-orbit transport and operation, and fundamental scientific research on the orbital environment. Attention is given to LOX/hydrogen and LOX/hydrocarbon reusable engines, liquid/solid fuel hybrid boosters, and aeroassist flight experiments for the validation of aerobraking with atmospheric friction. Also discussed are advanced scientific sensors, systems autonomy and telerobotics, control of flexible structures, precise segmented reflectors, high-rate high-capacity data handling, and advanced nuclear power systems.
2014-01-23
was broken, the willow bent when it must and survived." Robert Jordan, The Fires of Heaven. CERT | Software Engineering Institute | ... transfer-of-wealth-in-history-7000000598/ 8. Caralli, Richard A.; Allen, Julia H.; White, David W. CERT® Resilience Management Model: A Maturity
Mohammed, Rezwana Begum; Reddy, M Asha Lata; Jain, Megha; Singh, Johar Rajvinder; Sanghvi, Praveen; Thetay, Anshuj Ajay Rao
2014-09-01
In the growing years, indicators of the level of maturational development of the individual provide the best means for evaluating biologic age and the associated timing of skeletal growth. The relative stage of maturity of a child may be determined by comparing the child's hand-wrist radiograph to the known standards of skeletal development. In this study, we assessed various levels of skeletal maturation and also identified the relationship between chronological age (CA) and maturation stage using hand-wrist radiographs in adolescents of Indian origin. Three hundred and thirty hand-wrist digital radiographs of individuals aged 8 to 18 years were evaluated for skeletal maturity levels using Fishman's method. The data was analysed using the SPSS software package (version 12, SPSS Inc., Chicago, IL, USA). Regression analysis was performed to calculate bone age for both males and females. Spearman's rank-order correlation coefficients were estimated separately for males and females to assess the relation between CA and maturation level. The association between skeletal maturation indicator stages and CA (r = 0.82) was significant. Interestingly, female subjects were observed to be advanced in skeletal maturity compared to males. Regression equations were derived to calculate bone age in males, females, and the whole sample. The results of this study showed a significant association between hand-wrist skeletal maturation levels and CA. Digital radiographic assessment of hand-wrist skeletal maturation is a good choice for predicting the average bone age of an individual because of its simplicity, reliability, and lower radiation exposure.
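The kind of regression analysis the study performs can be sketched as a simple least-squares fit relating Fishman skeletal maturity indicator (SMI) stage to chronological age. The stage/age pairs below are invented for illustration and are not the study's data; the published regression equations are not reproduced here.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = slope * x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Invented (SMI stage, chronological age in years) pairs.
smi_stage = [1, 2, 4, 6, 8, 10, 11]
age_years = [9.0, 10.1, 11.5, 12.8, 14.2, 15.9, 16.5]
slope, intercept = fit_line(smi_stage, age_years)

def predicted_bone_age(stage):
    """Estimate bone age (years) from an SMI stage via the fitted line."""
    return slope * stage + intercept
```

In practice separate fits would be derived for males and females, as the abstract notes females mature earlier, and a rank correlation (Spearman's rho) rather than the fitted slope would be used to report the strength of the CA-maturation association.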
TkPl_SU: An Open-source Perl Script Builder for Seismic Unix
NASA Astrophysics Data System (ADS)
Lorenzo, J. M.
2017-12-01
TkPl_SU (beta) is a graphical user interface (GUI) for selecting parameters for Seismic Unix (SU) modules. Seismic Unix (Stockwell, 1999) is a widely distributed free software package for seismic reflection processing and signal processing. Perl/Tk is a mature, well-documented, and free object-oriented graphical user interface toolkit for Perl. In a classroom environment, shell scripting of SU modules engages students and helps focus attention on the theoretical limitations and strengths of signal processing. However, complex interactive processing stages, e.g., selection of optimal stacking velocities, killing bad data traces, or spectral analysis, require advanced flows beyond the scope of introductory classes. In a research setting, special functionality from other free seismic processing software such as SioSeis (UCSD-NSF) can be incorporated readily via an object-oriented style of programming. An object-oriented approach is a first step toward efficient extensible programming of multi-step processes, and a simple GUI simplifies parameter selection and decision making. Currently, in TkPl_SU, Perl 5 packages wrap 19 of the most common SU modules that are used in teaching undergraduate and first-year graduate classes (e.g., filtering, display, velocity analysis, and stacking). Perl packages (classes) can advantageously add new functionality around each module and clarify parameter names for easier usage. For example, through the use of methods, packages can isolate the user from repetitive control structures, as well as replace the names of abbreviated parameters with self-describing names. Moose, an extension of the Perl 5 object system, greatly facilitates an object-oriented style. Perl wrappers are self-documenting via Perl's Plain Old Documentation (POD) markup.
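The wrapper idea, mapping long self-describing parameter names onto an SU module's terse flags and emitting the command a flow script would run, can be sketched in Python (the paper uses Perl 5 with Moose; this only illustrates the pattern). The sufilter module and its f/amps parameters follow common Seismic Unix conventions, but treat the exact mapping here as an assumption.

```python
class SuModule:
    """One object per Seismic Unix module: translates descriptive
    parameter names into the module's short flags and builds the
    shell command a processing flow would execute."""

    def __init__(self, name, flag_map):
        self._name = name
        self._flag_map = flag_map   # descriptive name -> SU flag
        self._params = {}

    def set(self, **kwargs):
        """Set parameters by their self-describing names."""
        for long_name, value in kwargs.items():
            if long_name not in self._flag_map:
                raise KeyError("unknown parameter: " + long_name)
            self._params[self._flag_map[long_name]] = value
        return self

    def command(self):
        """Render the SU command line with flags in sorted order."""
        args = " ".join("%s=%s" % (k, v)
                        for k, v in sorted(self._params.items()))
        return ("%s %s" % (self._name, args)).strip()

# Assumed flag mapping for the sufilter bandpass module.
sufilter = SuModule("sufilter", {"frequencies": "f", "amplitudes": "amps"})
cmd = sufilter.set(frequencies="5,10,40,50",
                   amplitudes="0,1,1,0").command()
print(cmd)  # prints: sufilter amps=0,1,1,0 f=5,10,40,50
```

Chaining `set(...)` calls and hiding the flag abbreviations is the same ergonomic win the Perl packages provide; a GUI layer then only needs to populate the descriptive names.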
Multi-physics CFD simulations in engineering
NASA Astrophysics Data System (ADS)
Yamamoto, Makoto
2013-08-01
Nowadays Computational Fluid Dynamics (CFD) software is adopted as a design and analysis tool in a great number of engineering fields. We can say that single-physics CFD is sufficiently mature from a practical point of view. The main target of existing CFD software is single-phase flows such as water and air. However, many multi-physics problems exist in engineering. Most of them consist of flow coupled with other physics, and the interactions between the different physics are very important. Obviously, multi-physics phenomena are critical in developing machines and processes. A multi-physics phenomenon can be very complex, and it is difficult to predict simply by adding other physics to a flow simulation. Therefore, multi-physics CFD techniques are still under research and development. This stems from the facts that the processing speed of current computers is not fast enough for conducting multi-physics simulations, and that physical models other than flow physics have not been suitably established. Therefore, in the near future, we have to develop various physical models and efficient CFD techniques in order to make multi-physics simulations in engineering successful. In the present paper, I describe the present state of multi-physics CFD simulation, and then show some numerical results, such as ice accretion and the electro-chemical machining process of a three-dimensional compressor blade, which were obtained in my laboratory. Multi-physics CFD simulation will be a key technology in the near future.
EPA Scientific Knowledge Management Assessment and ...
A series of activities has been conducted by a core group of EPA scientists from across the Agency. The activities were initiated in 2012 with a focus on increasing the reuse and interoperability of science software at EPA. The need for increased reuse and interoperability is linked to the increased complexity of environmental assessments in the 21st century. This complexity is manifest in the form of problems that require integrated multi-disciplinary solutions. Developing these solutions (i.e., science software systems) requires integrating software developed by disparate groups representing a variety of science domains. Thus, reuse and interoperability become imperative. This report briefly describes the chronology of activities conducted by the group of scientists to provide context for the primary purpose of this report, that is, to describe the proceedings and outcomes of the latest activity, a workshop entitled “Workshop on Advancing US EPA integration of environmental and information sciences”. The EPA has been lagging in digital maturity relative to the private sector and even other government agencies. This report helps begin the process of improving the Agency’s use of digital technologies, especially in the areas of efficiency and transparency. This report contributes to SHC 1.61.2.
Massively Parallel Processing for Fast and Accurate Stamping Simulations
NASA Astrophysics Data System (ADS)
Gress, Jeffrey J.; Xu, Siguang; Joshi, Ramesh; Wang, Chuan-tao; Paul, Sabu
2005-08-01
The competitive automotive market drives automotive manufacturers to speed up vehicle development cycles and reduce lead-time. Fast tooling development is one of the key areas supporting fast and short vehicle development programs (VDP). In the past ten years, stamping simulation has become the most effective validation tool for predicting and resolving potential formability and quality problems before the dies are physically made. Stamping simulation and formability analysis have become a critical business segment in GM's math-based die engineering process. As simulation becomes one of the major production tools in the engineering factory, speed and accuracy are two of the most important measures of stamping simulation technology. The speed and time-in-system of forming analysis become even more critical to support the fast VDP and tooling readiness. Since 1997, the General Motors Die Center has been working jointly with our software vendor to develop and implement a parallel version of simulation software for mass-production analysis applications. By 2001, this technology had matured in the form of distributed memory processing (DMP) of draw die simulations in a networked distributed-memory computing environment. In 2004, this technology was refined to massively parallel processing (MPP) and extended to line die forming analysis (draw, trim, flange, and associated spring-back) running on a dedicated computing environment. The evolution of this technology and the insight gained through the implementation of DMP/MPP technology, as well as performance benchmarks, are discussed in this publication.
iPAS: AES Flight System Technology Maturation for Human Spaceflight
NASA Technical Reports Server (NTRS)
Othon, William L.
2014-01-01
In order to realize the vision of expanding human presence in space, NASA will develop new technologies that can enable future crewed spacecraft to go far beyond Earth orbit. These technologies must be matured to the point that future project managers can accept the risk of incorporating them safely and effectively within integrated spacecraft systems, to satisfy very challenging mission requirements. The technologies must also be applied and managed within an operational context that includes both on-board crew and mission support on Earth. The Advanced Exploration Systems (AES) Program is one part of the NASA strategy to identify and develop key capabilities for human spaceflight, and mature them for future use. To support this initiative, the Integrated Power Avionics and Software (iPAS) environment has been developed that allows engineers, crew, and flight operators to mature promising technologies into applicable capabilities, and to assess the value of these capabilities within a space mission context. This paper describes the development of the integration environment to support technology maturation and risk reduction, and offers examples of technology and mission demonstrations executed to date.
Mohammed, Rezwana Begum; Kalyan, V Siva; Tircouveluri, Saritha; Vegesna, Goutham Chakravarthy; Chirla, Anil; Varma, D Maruthi
2014-07-01
Determining the age of a person in the absence of documentary evidence of birth is essential for legal and medico-legal purposes. The Fishman method of skeletal maturation is widely used for this purpose; however, the reliability of this method for people of all geographic locations is not well established. In this study, we assessed various stages of carpal and metacarpal bone maturation and tested the reliability of the Fishman method of skeletal maturation to estimate age in a South Indian population. We also evaluated the correlation between the chronological age (CA) and the predicted age based on the Fishman method of skeletal maturation. Digital right hand-wrist radiographs of 330 individuals aged 9-20 years were obtained and the skeletal maturity stage for each subject was determined using the Fishman method. The skeletal maturation indicator scores were obtained and analyzed with reference to CA and sex. Data were analyzed using the SPSS software package (version 12, SPSS Inc., Chicago, IL, USA). The study subjects had a tendency toward late maturation, with the mean skeletal age (SA) estimated being significantly lower (P < 0.05) than the mean CA at various skeletal maturity stages. Nevertheless, significant correlation was observed in this study between SA and CA for males (r = 0.82) and females (r = 0.85). Interestingly, female subjects were observed to be advanced in SA compared with males. The Fishman method of skeletal maturation can be used as an alternative tool for the assessment of the mean age of an individual of unknown CA in South Indian children.
Puberty and dispersal in a wild primate population
Onyango, Patrick O.; Gesquiere, Laurence R.; Altmann, Jeanne; Alberts, Susan C.
2013-01-01
The onset of reproduction is preceded by a host of organismal adjustments and transformations, involving morphological, physiological, and behavioral changes. In highly social mammals, including humans and most nonhuman primates, the timing and nature of maturational processes is affected by the animal’s social milieu as well as its ecology. Here, we review a diverse set of findings on how maturation unfolds in wild baboons in the Amboseli basin of southern Kenya, and we place these findings in the context of other reports of maturational processes in primates and other mammals. First, we describe the series of events and processes that signal maturation in female and male baboons. Sex differences in age at both sexual maturity and first reproduction documented for this species are consistent with expectations of life history theory; males mature later than females and exhibit an adolescent growth spurt that is absent or minimal in females. Second, we summarize what we know about sources of variance in the timing of maturational processes including natal dispersal. In Amboseli, individuals in a food-enhanced group mature earlier than their wild-feeding counterparts, and offspring of high-ranking females mature earlier than offspring of low-ranking females. We also report on how genetic admixture, which occurs in Amboseli between two closely related baboon taxa, affects individual maturation schedules. PMID:23998668
Flexible software platform for fast-scan cyclic voltammetry data acquisition and analysis.
Bucher, Elizabeth S; Brooks, Kenneth; Verber, Matthew D; Keithley, Richard B; Owesson-White, Catarina; Carroll, Susan; Takmakov, Pavel; McKinney, Collin J; Wightman, R Mark
2013-11-05
Over the last several decades, fast-scan cyclic voltammetry (FSCV) has proved to be a valuable analytical tool for the real-time measurement of neurotransmitter dynamics in vitro and in vivo. Indeed, FSCV has found application in a wide variety of disciplines including electrochemistry, neurobiology, and behavioral psychology. The maturation of FSCV as an in vivo technique led users to pose increasingly complex questions that require a more sophisticated experimental design. To accommodate recent and future advances in FSCV application, our lab has developed High Definition Cyclic Voltammetry (HDCV). HDCV is an electrochemical software suite that includes data acquisition and analysis programs. The data collection program delivers greater experimental flexibility and better user feedback through live displays. It supports experiments involving multiple electrodes with customized waveforms. It is compatible with transistor-transistor logic-based systems that are used for monitoring animal behavior, and it enables simultaneous recording of electrochemical and electrophysiological data. HDCV analysis streamlines data processing with superior filtering options, seamlessly manages behavioral events, and integrates chemometric processing. Furthermore, analysis is capable of handling single files collected over extended periods of time, allowing the user to consider biological events on both subsecond and multiminute time scales. Here we describe and demonstrate the utility of HDCV for in vivo experiments.
Mychasiuk, Richelle; Metz, Gerlinde A S
2016-11-01
Adolescence is defined as the gradual period of transition between childhood and adulthood that is characterized by significant brain maturation, growth spurts, sexual maturation, and heightened social interaction. Although originally believed to be a uniquely human aspect of development, rodent and non-human primates demonstrate maturational patterns that distinctly support an adolescent stage. As epigenetic processes are essential for development and differentiation, but also transpire in mature cells in response to environmental influences, they are an important aspect of adolescent brain maturation. The purpose of this review article was to examine epigenetic programming in animal models of brain maturation during adolescence. The discussion focuses on animal models to examine three main concepts; epigenetic processes involved in normal adolescent brain maturation, the influence of fetal programming on adolescent brain development and the epigenome, and finally, postnatal experiences such as exercise and drugs that modify epigenetic processes important for adolescent brain maturation. This corollary emphasizes the utility of animal models to further our understanding of complex processes such as epigenetic regulation and brain development. Copyright © 2016 Elsevier Ltd. All rights reserved.
Heterogeneity of shale documented by micro-FTIR and image analysis.
Chen, Yanyan; Mastalerz, Maria; Schimmelmann, Arndt
2014-12-01
In this study, four New Albany Shale Devonian and Mississippian samples, with vitrinite reflectance (Ro) values ranging from 0.55% to 1.41%, were analyzed by micro-FTIR mapping of chemical and mineralogical properties. One additional postmature shale sample from the Haynesville Shale (Kimmeridgian, Ro = 3.0%) was included to test the limitations of the method for more mature substrates. Relative abundances of organic matter and mineral groups (carbonates, quartz and clays) were mapped across selected microscale regions based on characteristic infrared peaks and demonstrated to be consistent with corresponding bulk compositional percentages. Mapped distributions of organic matter provide information on the organic matter abundance and the connectivity of organic matter within the overall shale matrix. The pervasive distribution of organic matter mapped in the New Albany Shale sample MM4 is in agreement with this shale's high total organic carbon abundance relative to other samples. Mapped interconnectivity of organic matter domains in New Albany Shale samples is excellent in two early mature shale samples having Ro values from 0.55% to 0.65%, then dramatically decreases in a late mature sample having an intermediate Ro of 1.15% and finally increases again in the postmature sample, which has a Ro of 1.41%. Swanson permeabilities, derived from independent mercury intrusion capillary pressure porosimetry measurements, follow the same trend among the four New Albany Shale samples, suggesting that micro-FTIR, in combination with complementary porosimetric techniques, strengthens our understanding of porosity networks. In addition, image processing and analysis software (e.g. ImageJ) have the capability to quantify organic matter and total organic carbon - valuable parameters for highly mature rocks, because these cannot be analyzed by micro-FTIR owing to the weakness of the aliphatic carbon-hydrogen signal.
© 2014 The Authors Journal of Microscopy © 2014 Royal Microscopical Society.
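The image-based quantification mentioned above boils down to counting pixels that fall on one side of an intensity cutoff. The sketch below uses an invented 8-bit grayscale tile, not real FTIR or photomicrograph data, to illustrate the threshold-and-count operation that tools like ImageJ perform.

```python
# Minimal sketch of threshold-based area quantification: pixels at or below
# a grayscale cutoff are counted as organic matter, and their share of the
# image approximates the organic-matter area fraction. Data are hypothetical.

def organic_fraction(image, threshold=100):
    """Fraction of pixels at or below `threshold` (dark = organic matter)."""
    pixels = [p for row in image for p in row]
    return sum(p <= threshold for p in pixels) / len(pixels)

tile = [
    [ 40,  60, 220, 230],
    [ 50,  80, 210, 240],
    [200, 220,  90, 250],
    [210, 230, 100, 245],
]
print(organic_fraction(tile))  # 0.375
```

In practice the threshold would be calibrated against samples of known total organic carbon, and a connectivity analysis (labeling adjacent dark pixels) would give the interconnectivity measure discussed in the abstract.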
Andrae, J G; Hunt, C W; Pritchard, G T; Kennington, L R; Harrison, J H; Kezar, W; Mahanna, W
2001-09-01
A study involving a 2 x 2 x 2 factorial arrangement of treatments was conducted to evaluate effects of hybrid (Pioneer 3335 and 3489), maturity (half milkline and blacklayer), and mechanical processing (field chopper with and without on-board rollers engaged) on intake and digestibility of corn silage. Forty Angus steers (322 +/- 5.2 kg BW) were assigned to the eight silage treatments (five steers per treatment) and individually fed using electronic gates. Diets consisted of 60% corn silage and 40% chopped alfalfa hay (DM basis). Following a 5-d adaptation period, intake was measured for 7 d and subsequently fecal samples were collected for 5 d. Chromic oxide (5 g/d) was fed beginning 7 d before fecal sample collection and digestibility was determined by the ratio of Cr in the feed and feces. Steers were reallocated to treatments and these procedures were repeated, providing 10 observations per treatment. In addition, all silages were ruminally incubated in six mature cows for 0, 8, 16, 24, 48, and 96 h to determine extent and rate of DM, starch, NDF, and ADF disappearance. Processing increased DMI of hybrid 3489 but did not affect DMI of hybrid 3335 (hybrid x processing; P < 0.06). Total tract digestibility of DM, starch, NDF, and ADF decreased (P < 0.01) as plant maturity increased. Maturity tended to decrease starch digestibility more for hybrid 3489 than for hybrid 3335 (hybrid x maturity; P < 0.10). Processing increased (P < 0.01) starch digestibility but decreased (P < 0.01) NDF and ADF digestibility, resulting in no processing effect on DM digestibility. There was a numerical trend for processing to increase starch digestibility more for late- than for early-maturity corn silage (maturity x processing; P = 0.11). Processing increased in situ rates of DM and starch disappearance and maturity decreased in situ disappearance rates of starch and fiber. These data indicate that hybrid, maturity, and processing all affect corn silage digestibility.
Mechanical processing of corn silage increased starch digestibility, which may have been associated with the observed decreased fiber digestibility.
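The chromic oxide step above relies on the standard indicator (marker-ratio) method: because the marker is indigestible, its concentration rises in feces in proportion to how much of the diet disappeared. The concentrations below are illustrative, not the study's data.

```python
# Sketch of the marker-ratio digestibility calculation implied by the
# chromic oxide procedure: apparent digestibility is estimated from marker
# and nutrient concentrations in feed and feces (all on a DM basis).

def apparent_digestibility(marker_feed, marker_feces,
                           nutrient_feed, nutrient_feces):
    """Percent of nutrient apparently digested, by the indicator method."""
    return 100.0 * (1 - (marker_feed / marker_feces)
                      * (nutrient_feces / nutrient_feed))

# e.g. Cr at 0.25% of feed DM and 0.75% of fecal DM; starch at 30% of feed
# DM and 4% of fecal DM:
print(round(apparent_digestibility(0.25, 0.75, 30.0, 4.0), 1))  # 95.6
```

Setting the nutrient terms to total DM concentrations (100% in both) reduces the formula to DM digestibility, 1 minus the feed-to-feces marker ratio.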
NASA Technical Reports Server (NTRS)
Phillips, Veronica J.
2017-01-01
The Ames Engineering Directorate is the principal engineering organization supporting aerospace systems and spaceflight projects at NASA's Ames Research Center in California's Silicon Valley. The Directorate supports all phases of engineering and project management for flight and mission projects, from R&D to close-out, by leveraging the capabilities of multiple divisions and facilities. The Mission Design Center (MDC) has full end-to-end mission design capability with sophisticated analysis and simulation tools in a collaborative concurrent design environment. Services include concept maturity level (CML) maturation, spacecraft design and trades, scientific instrument selection, feasibility assessments, and proposal support and partnerships. The Engineering Systems Division provides robust project management support as well as systems engineering, mechanical and electrical analysis and design, technical authority and project integration support to a variety of programs and projects across NASA centers. The Applied Manufacturing Division turns abstract ideas into tangible hardware for aeronautics, spaceflight and science applications, specializing in fabrication methods and management of complex fabrication projects. The Engineering Evaluation Lab (EEL) provides full satellite or payload environmental testing services including vibration, temperature, humidity, immersion, pressure/altitude, vacuum, high-G centrifuge and shock impact testing, as well as the Flight Processing Center (FPC), which includes cleanrooms, bonded stores and flight preparation resources. The Multi-Mission Operations Center (MMOC) is composed of the facilities, networks, IT equipment, software and support services needed by flight projects to effectively and efficiently perform all mission functions, including planning, scheduling, command, telemetry processing and science analysis.
In vitro protease cleavage and computer simulations reveal the HIV-1 capsid maturation pathway
NASA Astrophysics Data System (ADS)
Ning, Jiying; Erdemci-Tandogan, Gonca; Yufenyuy, Ernest L.; Wagner, Jef; Himes, Benjamin A.; Zhao, Gongpu; Aiken, Christopher; Zandi, Roya; Zhang, Peijun
2016-12-01
HIV-1 virions assemble as immature particles containing Gag polyproteins that are processed by the viral protease into individual components, resulting in the formation of mature infectious particles. There are two competing models for the process of forming the mature HIV-1 core: the disassembly and de novo reassembly model and the non-diffusional displacive model. To study the maturation pathway, we simulate HIV-1 maturation in vitro by digesting immature particles and assembled virus-like particles with recombinant HIV-1 protease and monitor the process with biochemical assays and cryoEM structural analysis in parallel. Processing of Gag in vitro is accurate and efficient and results in both soluble capsid protein and conical or tubular capsid assemblies, seemingly converted from immature Gag particles. Computer simulations further reveal probable assembly pathways of HIV-1 capsid formation. Combining the experimental data and computer simulations, our results suggest a sequential combination of both displacive and disassembly/reassembly processes for HIV-1 maturation.
E-learning process maturity level: a conceptual framework
NASA Astrophysics Data System (ADS)
Rahmah, A.; Santoso, H. B.; Hasibuan, Z. A.
2018-03-01
ICT advancement is a sure thing, with its impact influencing many domains, including learning in both formal and informal situations. It leads to a new mindset: we should not only utilize the given ICT to support the learning process, but also improve it gradually, involving many factors. This phenomenon is called e-learning process evolution. Accordingly, this study explores the maturity level concept to provide a direction for gradual improvement and progression monitoring for the individual e-learning process. An extensive literature review, observation, and construct formation were conducted to develop a conceptual framework for e-learning process maturity level. The conceptual framework consists of the learner, the e-learning process, continuous improvement, evolution of the e-learning process, technology, and learning objectives. The evolution of the e-learning process is depicted as the current versus expected condition of e-learning process maturity level. The study concludes that the e-learning process maturity level conceptual framework may guide the evolution roadmap for the e-learning process, accelerate the evolution, and decrease the negative impact of ICT. The conceptual framework will be verified and tested in a future study.
Development of a Common User Interface for the Launch Decision Support System
NASA Technical Reports Server (NTRS)
Scholtz, Jean C.
1991-01-01
The Launch Decision Support System (LDSS) is software to be used by the NASA Test Director (NTD) in the firing room during countdown. This software is designed to assist the NTD with time management, that is, when to resume from a hold condition. This software will assist the NTD in making and evaluating alternate plans and will keep him advised of the existing situation. As such, the interface to this software must be designed to provide the maximum amount of information in the clearest fashion and in a timely manner. This research involves applying user interface guidelines to a mature prototype of LDSS and developing displays that will enable the users to easily and efficiently obtain information from the LDSS displays. This research also extends previous work on organizing and prioritizing human-computer interaction knowledge.
NASA Technical Reports Server (NTRS)
Mittman, David S.
2011-01-01
Ensemble is an open architecture for the development, integration, and deployment of mission operations software. Fundamentally, it is an adaptation of the Eclipse Rich Client Platform (RCP), a widespread, stable, and supported framework for component-based application development. By capitalizing on the maturity and availability of the Eclipse RCP, Ensemble offers a low-risk, politically neutral path towards a tighter integration of operations tools. The Ensemble project is a highly successful, ongoing collaboration among NASA Centers. Since 2004, the Ensemble project has supported the development of mission operations software for NASA's Exploration Systems, Science, and Space Operations Directorates.
HyperCard and Other Macintosh Applications in Astronomy Education
NASA Astrophysics Data System (ADS)
Meisel, D.
1992-12-01
For the past six years, Macintosh computers have been used in introductory astronomy classes and laboratories with HyperCard and other commercial Macintosh software. I will review some of the available software that has been found particularly useful in undergraduate situations. The review will start with HyperCard (a programmable "index card" system) since it is a mature multimedia platform for the Macintosh. Experiences with the Voyager, the TS-24, MathCad, NIH Image, and other programs as used by the author and George Mumford (Tufts University) in courses and workshops will be described.
Open source approaches to health information systems in Kenya.
Drury, Peter; Dahlman, Bruce
2005-01-01
This paper focuses on the experience to date of an installation of a Free Open Source Software (FOSS) product, Care2X, at a church hospital in Kenya. The FOSS movement has been maturing rapidly. In developed countries, its benefits relative to proprietary software have been extensively discussed, and ways of quantifying the total costs of development have been devised. Nevertheless, empirical data on the use, development, and impact of FOSS, particularly in the developing world, are still quite limited, although the possibilities of FOSS are becoming increasingly attractive.
Surviving Nuclear Winter: Towards a Service-Led Business
NASA Astrophysics Data System (ADS)
Rocha, Michael; Chou, Timothy
During the tech-led recession in 2001, a little-known transformation occurred at the world's largest business software company. This transformation was led by the realization that existing customers of mature software need service for the products they have purchased more than they need to purchase new products. Organizing around the installed base of customers defined both new organizations and new technology to power the specialists. This paper gives a glimpse of the Oracle transformation and lays out some fundamental tenets for anyone interested in a service-led business.
Flexibility of Bricard's linkages and other structures via resultants and computer algebra.
Lewis, Robert H; Coutsias, Evangelos A
2016-07-01
Flexibility of structures is extremely important for chemistry and robotics. Following our earlier work, we study flexibility using polynomial equations, resultants, and a symbolic algorithm of our creation that analyzes the resultant. We show that the software solves a classic arrangement of quadrilaterals in the plane due to Bricard. We fill in several gaps in Bricard's work and discover new flexible arrangements that he was apparently unaware of. This provides strong evidence for the maturity of the software, and is a wonderful example of mathematical discovery via computer assisted experiment.
Checkley, Mary Ann; Luttge, Benjamin G; Soheilian, Ferri; Nagashima, Kunio; Freed, Eric O
2010-04-25
The human immunodeficiency virus type 1 (HIV-1) maturation inhibitor bevirimat disrupts virus replication by inhibiting the cleavage of the capsid-spacer peptide 1 (CA-SP1) Gag processing intermediate to mature CA. The observation that bevirimat delays but does not completely block CA-SP1 processing suggests that the presence of uncleaved CA-SP1 may disrupt the maturation process in trans. In this study, we validate this hypothesis by using a genetic approach to demonstrate that a non-cleavable CA-SP1 mutant exerts a dominant-negative effect on maturation of wild-type HIV-1. In contrast, a mutant in which cleavage can occur internally within SP1 is significantly less potent as a dominant-negative inhibitor. We also show that bevirimat blocks processing at both the major CA-SP1 cleavage site and the internal site. These data underscore the importance of full CA-SP1 processing for HIV-1 maturation and highlight the therapeutic potential of inhibitors that target this Gag cleavage event. Published by Elsevier Inc.
Autophagy proteins are not universally required for phagosome maturation.
Cemma, Marija; Grinstein, Sergio; Brumell, John H
2016-09-01
Phagocytosis plays a central role in immunity and tissue homeostasis. After internalization of cargo into single-membrane phagosomes, these compartments undergo a maturation sequence that terminates in lysosome fusion and cargo degradation. Components of the autophagy pathway have recently been linked to phagosome maturation in a process called LC3-associated phagocytosis (LAP). In this process, autophagy machinery is thought to conjugate LC3 directly onto the phagosomal membrane to promote lysosome fusion. However, a recent study has suggested that ATG proteins may in fact impair phagosome maturation to promote antigen presentation. Here, we examined the impact of ATG proteins on phagosome maturation in murine cells using FCGR2A/FcγR-dependent phagocytosis as a model. We show that phagosome maturation is not affected in Atg5-deficient mouse embryonic fibroblasts, or in Atg5- or Atg7-deficient bone marrow-derived macrophages using standard assays of phagosome maturation. We propose that ATG proteins may be required for phagosome maturation under some conditions, but are not universally required for this process.
Alves, David; Mato, Salustiano
2016-01-01
In general, in composting facilities the active, or intensive, stage of the process is done separately from the maturation stage, using a specific technology and time. The pre-composted material to be matured can contain enough biodegradable substrates to cause microbial proliferation, which in turn can cause temperatures to increase. Therefore, not controlling the maturation period during waste management at an industrial level can result in undesired outcomes. The main hypothesis of this study is that controlling the maturation stage through turning provides an optimized process compared with the static approach. The waste used was sludge from a seafood-processing plant, mixed with shredded wood (1:2, v/v). The composting system consists of an intensive stage in a 600 L static reactor, followed by maturation in triplicate in 200 L boxes for 112 days. Two tests were carried out with the same process in the reactor and different treatments in the boxes: static maturation, and turning during maturation whenever the temperature went above 55°C. PLFAs, organic matter, pH, electrical conductivity, forms of nitrogen and carbon, hydrolytic enzymes and respiratory activity were measured periodically. Turning significantly increased the duration of the thermophilic phase and consequently increased organic-matter degradation. PCA significantly differentiated the two treatments as a function of the tracking parameters, especially pH, total carbon, forms of nitrogen and the C/N ratio. Thus, optimum stability and maturity values for compost were achieved in less time with turnings. Whereas turning resulted in microbial-group stabilization and a low mono/sat ratio, static treatment produced greater variability in microbial groups and a high mono/sat ratio; the presence of more degradable substrates causes changes in microbial communities, and studying them during maturation gives an indication of the state of organic-matter degradation.
Obtaining quality compost and optimizing the composting process requires using turning as a control mechanism during maturation. PMID:28002444
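The control rule in the turned treatment is simple: turn the pile whenever its temperature exceeds 55°C. A minimal sketch of that rule, with invented daily temperature readings, looks like this:

```python
# Sketch of the temperature-triggered turning rule from the controlled
# treatment: the pile is turned on any day its temperature exceeds 55 degC.
# The temperature series is hypothetical, not the study's data.

TURN_THRESHOLD_C = 55.0

def turning_days(daily_temps_c, threshold=TURN_THRESHOLD_C):
    """Days (0-indexed) on which the pile would be turned."""
    return [day for day, t in enumerate(daily_temps_c) if t > threshold]

temps = [38.0, 48.5, 57.2, 61.0, 54.9, 56.3, 49.0]
print(turning_days(temps))  # [2, 3, 5]
```

In an industrial setting the same rule would run against logged probe readings, and each turn both releases heat and redistributes the remaining biodegradable substrate, which is what extends the thermophilic phase reported above.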
A conceptual framework and classification of capability areas for business process maturity
NASA Astrophysics Data System (ADS)
Van Looy, Amy; De Backer, Manu; Poels, Geert
2014-03-01
The article elaborates on business process maturity, which indicates how well an organisation can perform based on its business processes, i.e. on its way of working. This topic is of paramount importance for managers who try to excel in today's competitive world. Hence, business process maturity is an emerging research field. However, no consensus exists on the capability areas (or skills) needed to excel. Moreover, their theoretical foundation and synergies with other fields are frequently neglected. To overcome this gap, our study presents a conceptual framework with six main capability areas and 17 sub-areas. It draws on theories regarding the traditional business process lifecycle, which are supplemented by recognised organisation management theories. The comprehensiveness of this framework is validated by mapping 69 business process maturity models (BPMMs) to the identified capability areas, based on content analysis. Nonetheless, as no consensus exists among the collected BPMMs either, a classification of different maturity types is proposed, based on cluster analysis and discriminant analysis. Consequently, the findings contribute to the grounding of the business process literature. Possible future avenues are evaluating existing BPMMs, directing new BPMMs or investigating which combinations of capability areas (i.e. maturity types) contribute more to performance than others.
Mohammed, Rezwana Begum; Kalyan, V. Siva; Tircouveluri, Saritha; Vegesna, Goutham Chakravarthy; Chirla, Anil; Varma, D. Maruthi
2014-01-01
Introduction: Determining the age of a person in the absence of documentary evidence of birth is essential for legal and medico-legal purposes. The Fishman method of skeletal maturation is widely used for this purpose; however, the reliability of this method across geographic locations is not well established. Aims and Objectives: In this study, we assessed various stages of carpal and metacarpal bone maturation and tested the reliability of the Fishman method of skeletal maturation for estimating age in a South Indian population. We also evaluated the correlation between chronological age (CA) and the age predicted by the Fishman method. Materials and Methods: Digital right hand-wrist radiographs of 330 individuals aged 9-20 years were obtained, and the skeletal maturity stage for each subject was determined using the Fishman method. The skeletal maturation indicator scores were obtained and analyzed with reference to CA and sex. Data was analyzed using the SPSS software package (version 12, SPSS Inc., Chicago, IL, USA). Results: The study subjects had a tendency toward late maturation, with the estimated mean skeletal age (SA) being significantly lower (P < 0.05) than the mean CA at various skeletal maturity stages. Nevertheless, significant correlation was observed between SA and CA for males (r = 0.82) and females (r = 0.85). Interestingly, female subjects were observed to be advanced in SA compared with males. Conclusion: The Fishman method of skeletal maturation can be used as an alternative tool for assessing the mean age of an individual of unknown CA in South Indian children. PMID:25097402
Comparison of Acceleration Techniques for Selected Low-Level Bioinformatics Operations
Langenkämper, Daniel; Jakobi, Tobias; Feld, Dustin; Jelonek, Lukas; Goesmann, Alexander; Nattkemper, Tim W.
2016-01-01
In recent years, clock rates of modern processors have stagnated while the demand for computing power has continued to grow. This applies particularly to the life sciences and bioinformatics, where new technologies keep generating rapidly growing piles of raw data at increasing speed. The number of cores per processor has increased in an attempt to compensate for the small increments in clock rate. This technological shift demands changes in software development, especially in high-performance computing, where parallelization techniques are gaining importance owing to the pressing issue of the large datasets generated by, e.g., modern genomics. This paper presents an overview of state-of-the-art manual and automatic acceleration techniques and lists applications employing them in different areas of sequence informatics. Furthermore, we provide examples of the automatic acceleration of two use cases to show the typical problems and gains of transforming a serial application into a parallel one. The paper should aid the reader in deciding on a technique for the problem at hand. We compare four state-of-the-art automatic acceleration approaches (OpenMP, PluTo-SICA, PPCG, and OpenACC) and discuss their performance as well as their applicability to the selected use cases. While optimizations targeting the CPU worked better in the complex k-mer use case, optimizers for Graphics Processing Units (GPUs) performed better in the matrix-multiplication example, although GPU performance is superior only beyond a certain problem size because of data-migration overhead. We show that automatic code parallelization is feasible with current compiler software and yields significant increases in execution speed. Automatic optimizers for the CPU are mature, and usually no additional manual adjustment is required. In contrast, some automatic parallelizers targeting GPUs still lack maturity and are limited to simple statements and structures. PMID:26904094
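The serial-to-parallel transformation applied to the matrix-multiplication use case can be sketched in Python. This is only an illustrative analogue of the loop decomposition; the paper's actual tools (OpenMP, PluTo-SICA, PPCG, OpenACC) operate on C code, and all names below are hypothetical:

```python
from concurrent.futures import ThreadPoolExecutor

def matmul_serial(a, b):
    """Naive triple-loop matrix multiply: C = A x B."""
    p = len(b[0])
    return [[sum(row[k] * b[k][j] for k in range(len(b))) for j in range(p)]
            for row in a]

def matmul_parallel(a, b, workers=4):
    """The same computation with the outer (row) loop distributed over
    worker threads, mirroring an OpenMP '#pragma omp parallel for' on the
    outer loop. Python's GIL prevents a real speedup here; the point is
    the decomposition, which carries over directly to C/OpenMP."""
    p = len(b[0])

    def one_row(row):
        # Each worker computes one full output row independently.
        return [sum(row[k] * b[k][j] for k in range(len(b))) for j in range(p)]

    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(one_row, a))
```

Because each output row depends only on one input row and the shared (read-only) matrix `b`, the outer loop has no loop-carried dependencies, which is exactly the property the automatic parallelizers look for.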
Does implied volatility of currency futures option imply volatility of exchange rates?
NASA Astrophysics Data System (ADS)
Wang, Alan T.
2007-02-01
By investigating currency futures options, this paper provides an alternative economic implication for the result reported by Stein [Overreactions in the options market, Journal of Finance 44 (1989) 1011-1023] that long-maturity options tend to overreact to changes in the implied volatility of short-maturity options. When a GARCH process is assumed for exchange rates, a continuous-time relationship is developed. We provide evidence that implied volatilities may not be the simple average of future expected volatilities. By comparing the term-structure relationship of implied volatilities with the process of the underlying exchange rates, we find that long-maturity options are more consistent with the exchange rates process. In sum, short-maturity options overreact to the dynamics of underlying assets rather than long-maturity options overreacting to short-maturity options.
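Under the GARCH(1,1) dynamics assumed for the exchange rate, the k-step-ahead conditional variance mean-reverts geometrically to its unconditional level, and a model-consistent implied variance for an n-period option averages those forecasts. A minimal sketch, with purely hypothetical parameter values:

```python
def garch_variance_forecast(omega, alpha, beta, var_t, k):
    """k-step-ahead conditional variance under GARCH(1,1):
    E[s2_{t+k}] = s2_bar + (alpha + beta)**k * (var_t - s2_bar),
    where s2_bar = omega / (1 - alpha - beta) is the unconditional level."""
    persistence = alpha + beta
    s2_bar = omega / (1.0 - persistence)
    return s2_bar + persistence ** k * (var_t - s2_bar)

def model_implied_variance(omega, alpha, beta, var_t, n):
    """Average expected variance over an n-period option life; under the
    GARCH assumption this mean-reverting average, not a flat average of
    spot volatilities, shapes the implied-volatility term structure."""
    return sum(garch_variance_forecast(omega, alpha, beta, var_t, k)
               for k in range(1, n + 1)) / n
```

With the current variance above the unconditional level (e.g. 2.0 vs. 1.0), the sketch produces a downward-sloping term structure: short-maturity implied variance sits near the spot level while long-maturity implied variance approaches the long-run mean, which is one way implied volatilities can fail to be a simple average of future expected volatilities.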
Surveillance Jumps on the Network
ERIC Educational Resources Information Center
Raths, David
2011-01-01
Internet protocol (IP) network-based cameras and digital video management software are maturing, and many issues that have surrounded them, including bandwidth, data storage, ease of use, and integration are starting to become clearer as the technology continues to evolve. Prices are going down and the number of features is going up. Many school…
A Statistics Curriculum for the Undergraduate Chemistry Major
ERIC Educational Resources Information Center
Schlotter, Nicholas E.
2013-01-01
Our ability to statistically analyze data has grown significantly with the maturing of computer hardware and software. However, the evolution of our statistics capabilities has taken place without a corresponding evolution in the curriculum for the undergraduate chemistry major. Most faculty understands the need for a statistical educational…
ERIC Educational Resources Information Center
Malina, Robert M.
2014-01-01
Growth, maturation, and development dominate the daily lives of children and adolescents for approximately the first 2 decades of life. Growth and maturation are biological processes, while development is largely a behavioral process. The 3 processes occur simultaneously and interact. They can be influenced by physical activity and also can…
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 2 2010-01-01 2010-01-01 false Mature. 51.1313 Section 51.1313 Agriculture... Standards for Winter Pears 1 Definitions § 51.1313 Mature. (a) Mature means that the pear has reached the stage of maturity which will insure the proper completion of the ripening process. (b) Before a mature...
Code of Federal Regulations, 2012 CFR
2012-01-01
... 7 Agriculture 2 2012-01-01 2012-01-01 false Mature. 51.1313 Section 51.1313 Agriculture... Standards for Winter Pears 1 Definitions § 51.1313 Mature. (a) Mature means that the pear has reached the stage of maturity which will insure the proper completion of the ripening process. (b) Before a mature...
Code of Federal Regulations, 2011 CFR
2011-01-01
... 7 Agriculture 2 2011-01-01 2011-01-01 false Mature. 51.1313 Section 51.1313 Agriculture... Standards for Winter Pears 1 Definitions § 51.1313 Mature. (a) Mature means that the pear has reached the stage of maturity which will insure the proper completion of the ripening process. (b) Before a mature...
NASA Technical Reports Server (NTRS)
Cole, Stuart K.; Wallace, Jon; Schaffer, Mark; May, M. Scott; Greenberg, Marc W.
2014-01-01
As a leader in space technology research and development, NASA is continuing the development of the Technology Estimating process, initiated in 2012, for estimating the cost and schedule of low-maturity technology research and development, where the Technology Readiness Level (TRL) is less than TRL 6. NASA's Technology Roadmap consists of 14 technology areas. The focus of this continuing Technology Estimating effort included four Technology Areas (TAs): TA3 Space Power and Energy Storage, TA4 Robotics, TA8 Instruments, and TA12 Materials, to confine the research to the most abundant data pool. This report continues the technology estimating efforts completed during 2013-2014 and addresses the refinement of the parameters selected and recommended for use in the estimating process, where the parameters developed are applicable to the Cost Estimating Relationships (CERs) used in parametric cost estimating analysis. This research addresses the architecture for administration of the Technology Cost and Scheduling Estimating tool, the parameters suggested for computer software adjunct to any technology area, and the identification of gaps in the Technology Estimating process.
Onboard Processor for Compressing HSI Data
NASA Technical Reports Server (NTRS)
Cook, Sid; Harsanyi, Joe; Day, John H. (Technical Monitor)
2002-01-01
With EO-1 Hyperion and MightySat in orbit, NASA and the DoD are showing their continued commitment to hyperspectral imaging (HSI). As HSI sensor technology continues to mature, the ever-increasing amounts of sensor data generated will result in a need for more cost-effective communication and data handling systems. Lockheed Martin, with considerable experience in spacecraft design and in developing special-purpose onboard processors, has teamed with Applied Signal & Image Technology (ASIT), which has an extensive heritage in HSI, to develop a real-time, intelligent onboard processing (OBP) system to reduce HSI sensor downlink requirements. Our goal is to reduce the downlink requirement by a factor greater than 100 while retaining the spectral fidelity of the sensor data needed to satisfy the many science, military, and intelligence goals of these systems. Our initial spectral compression experiments leverage commercial off-the-shelf (COTS) spectral exploitation algorithms for segmentation, material identification, and spectral compression that ASIT has developed. ASIT will also support the modification and integration of this COTS software into the OBP. Other commercially available COTS software for spatial compression will also be employed as part of the overall compression processing sequence. Over the next year, elements of a high-performance reconfigurable OBP will be developed to implement proven preprocessing steps that distill the HSI data stream in both the spectral and spatial dimensions. The system will intelligently reduce the volume of data that must be stored, transmitted to the ground, and processed, while minimizing the loss of information.
Design Considerations of a Virtual Laboratory for Advanced X-ray Sources
NASA Astrophysics Data System (ADS)
Luginsland, J. W.; Frese, M. H.; Frese, S. D.; Watrous, J. J.; Heileman, G. L.
2004-11-01
The field of scientific computation has greatly advanced in the last few years, resulting in the ability to perform complex computer simulations that can predict the performance of real-world experiments in a number of fields of study. Among the forces driving this new computational capability is the advent of parallel algorithms, allowing calculations in three-dimensional space with realistic time scales. Electromagnetic radiation sources driven by high-voltage, high-current electron beams offer an area in which to further push the state of the art in high-fidelity, first-principles simulation tools. The physics of these x-ray sources combines kinetic plasma physics (electron beams) with dense fluid-like plasma physics (anode plasmas) and x-ray generation (bremsstrahlung). There are a number of mature techniques and software packages for dealing with the individual aspects of these sources, such as Particle-In-Cell (PIC), Magneto-Hydrodynamics (MHD), and radiation transport codes. The current effort is focused on developing an object-oriented software environment, using the Rational Unified Process and the Unified Modeling Language (UML), to provide a framework in which multiple 3D parallel physics packages, such as a PIC code (ICEPIC), an MHD code (MACH), and an x-ray transport code (ITS), can co-exist in a system-of-systems approach to modeling advanced x-ray sources. The initial software design and assessments of the fidelity of the various physics algorithms will be presented.
NASA Astrophysics Data System (ADS)
Ferrell, James E.; Xiong, Wen
2001-03-01
Xenopus oocyte maturation is an example of an all-or-none, irreversible cell fate induction process. In response to a submaximal concentration of the steroid hormone progesterone, a given oocyte may either mature or not mature, but it can exist in intermediate states only transiently. Moreover, once an oocyte has matured, it will remain arrested in the mature state even after the progesterone is removed. It has been hypothesized that the all-or-none character of oocyte maturation, and some aspects of the irreversibility of maturation, arise out of the bistability of the signal transduction system that triggers maturation. The bistability, in turn, is hypothesized to arise from the way the signal transducers are organized into a signaling circuit that includes positive feedback (which makes it so that the system cannot rest in intermediate states) and ultrasensitivity (which filters small stimuli out of the feedback loop, allowing the system to have a stable off-state). Here we review two simple graphical methods that are commonly used to analyze bistable systems, discuss the experimental evidence for bistability in oocyte maturation, and suggest that bistability may be a common means of producing all-or-none responses and a type of biochemical memory.
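The circuit logic described (ultrasensitive positive feedback filtered against first-order inactivation) can be checked numerically with a rate-balance scan, one of the graphical methods the review covers. All parameter values here are illustrative, not fitted to oocyte data:

```python
def dxdt(x, stimulus, n=5, K=0.5, vmax=1.0, decay=1.0):
    """Rate of change of the active signal x: an ultrasensitive (Hill
    coefficient n > 1) positive-feedback production term plus stimulus,
    opposed by first-order inactivation."""
    feedback = vmax * x**n / (K**n + x**n)
    return stimulus + feedback - decay * x

def steady_states(stimulus, grid=10000):
    """Locate stable fixed points on x in [0, 2] by finding where dx/dt
    crosses from positive to negative (a falling crossing is stable)."""
    xs = [2.0 * i / grid for i in range(grid + 1)]
    states = []
    for x0, x1 in zip(xs, xs[1:]):
        if dxdt(x0, stimulus) > 0 >= dxdt(x1, stimulus):
            states.append(round((x0 + x1) / 2, 3))
    return states
```

With stimulus = 0.05 the scan finds two stable states, an off-state near zero and an on-state near 1.0; raising the stimulus to 0.8 leaves only the on-state, i.e. the all-or-none switch the review describes.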
USDA-ARS?s Scientific Manuscript database
Clonal cultures of pig-derived mature adipocytes are capable of dedifferentiating and forming proliferative-competent progeny cells in vitro. Initial lipid processing is different from that observed in cultures of beef-derived adipocytes. Mature pig adipocytes extrude lipid before proliferation, wher...
Predictive Capability Maturity Model for computational modeling and simulation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oberkampf, William Louis; Trucano, Timothy Guy; Pilch, Martin M.
2007-10-01
The Predictive Capability Maturity Model (PCMM) is a new model that can be used to assess the level of maturity of computational modeling and simulation (M&S) efforts. The development of the model is based on both the authors' experience and their analysis of similar investigations in the past. The perspective taken in this report is one of judging the usefulness of a predictive capability that relies on the numerical solution of partial differential equations to better inform and improve decision making. The review of past investigations, such as the Software Engineering Institute's Capability Maturity Model Integration and the National Aeronautics and Space Administration and Department of Defense Technology Readiness Levels, indicates that a more restricted, more interpretable method is needed to assess the maturity of an M&S effort. The PCMM addresses six contributing elements to M&S: (1) representation and geometric fidelity, (2) physics and material model fidelity, (3) code verification, (4) solution verification, (5) model validation, and (6) uncertainty quantification and sensitivity analysis. For each of these elements, attributes are identified that characterize four increasing levels of maturity. Importantly, the PCMM is a structured method for assessing the maturity of an M&S effort directed toward an engineering application of interest. The PCMM does not assess whether the M&S effort, the accuracy of the predictions, or the performance of the engineering system satisfies specified application requirements.
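The six elements, each scored on four maturity levels, invite a simple tabular assessment. A sketch follows; the aggregation rule (report the weakest element) is an illustrative choice, not something the PCMM itself prescribes:

```python
PCMM_ELEMENTS = [
    "representation and geometric fidelity",
    "physics and material model fidelity",
    "code verification",
    "solution verification",
    "model validation",
    "uncertainty quantification and sensitivity analysis",
]

def pcmm_assessment(scores):
    """scores: {element: maturity level 0-3}. Validates that every element
    is scored on the four-level scale, then returns the table plus the
    minimum level, on the (illustrative) view that an M&S effort is only
    as mature as its weakest contributing element."""
    missing = [e for e in PCMM_ELEMENTS if e not in scores]
    if missing:
        raise ValueError(f"unscored elements: {missing}")
    bad = [e for e, s in scores.items() if s not in (0, 1, 2, 3)]
    if bad:
        raise ValueError(f"levels must be 0-3: {bad}")
    return {"scores": dict(scores), "weakest": min(scores.values())}
```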
Color back projection for fruit maturity evaluation
NASA Astrophysics Data System (ADS)
Zhang, Dong; Lee, Dah-Jye; Desai, Alok
2013-12-01
In general, fruits and vegetables such as tomatoes and dates are harvested before they fully ripen. After harvesting, they continue to ripen and their color changes. Color is a good indicator of fruit maturity. For example, tomatoes change color from dark green to light green and then pink, light red, and dark red. Assessing tomato maturity helps maximize shelf life, and color is used to determine how long the tomatoes can be transported. Medjool dates change color from green to yellow, and then orange, light red, and dark red. Assessing date maturity helps determine the length of the drying process needed to ripen the dates. Color evaluation is an important step in the processing and inventory control of fruits and vegetables that directly affects profitability. This paper presents an efficient color back-projection and image processing technique designed specifically for real-time maturity evaluation of fruits. The method requires only a simple training procedure to obtain the frequencies of the colors that appear at each maturity stage. These color statistics are used to back-project colors to predefined color indexes, and fruit maturity is then evaluated by analyzing the reprojected color indexes. The method has been implemented and is used in commercial production.
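The back-projection scheme described (train per-stage color frequencies, then map each pixel to the stage that best explains it and vote over the image) can be sketched as follows; the quantized color names and stage labels are hypothetical:

```python
from collections import Counter

def train_stage_histograms(samples):
    """samples: {stage: list of quantized pixel colors observed at that
    maturity stage}. Returns per-stage relative color frequencies."""
    hists = {}
    for stage, pixels in samples.items():
        counts = Counter(pixels)
        total = sum(counts.values())
        hists[stage] = {c: n / total for c, n in counts.items()}
    return hists

def back_project(pixels, hists):
    """Back-project each pixel to the stage whose trained histogram gives
    its color the highest frequency, then vote over the whole image.
    Ties (colors unseen in training) fall to the first trained stage."""
    votes = Counter()
    for p in pixels:
        best = max(hists, key=lambda s: hists[s].get(p, 0.0))
        votes[best] += 1
    return votes.most_common(1)[0][0]
```

Training reduces to counting quantized colors per stage, which is why the abstract can call the procedure "very simple": no model fitting is involved, only frequency tables.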
Software Supply Chain Risk Management: From Products to Systems of Systems
2010-12-01
Threat modeling is a part of Microsoft's SDL [Howard 2006, Swiderski 2004]. Stephen Lipner has designated it as the most important part of the…
ERIC Educational Resources Information Center
Semushin, I. V.; Tsyganova, J. V.; Ugarov, V. V.; Afanasova, A. I.
2018-01-01
Russian higher education institutions' tradition of teaching large-enrolled classes is impairing student striving for individual prominence, one-upmanship, and hopes for originality. Intending to convert these drawbacks into benefits, a Project-Centred Education Model (PCEM) has been introduced to deliver Computational Mathematics and…
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Leary, Conlan
Over the project, Sighten built a comprehensive software-as-a-service (SaaS) platform to automate and streamline the residential solar financing workflow. Before the project period, significant time and money were spent by companies on front-end tools for system design and proposal creation, but comparatively few resources were available to support the many back-end calculations and data-management processes that underpin third-party financing. Without a tool like Sighten, the solar financing process involved passing information about the homeowner prospect into separate tools for system design and financing, and later into reporting tools including Microsoft Excel, CRM software, in-house software, outside software, and offline, manual processes. Passing data between tools and attempting to connect disparate systems results in inefficiency and inaccuracy for the industry. Sighten was built to consolidate all financial and solar-related calculations in a single software platform. It significantly improves the accuracy of these calculations and exposes sophisticated new analysis tools, resulting in a rigorous, efficient and cost-effective toolset for scaling residential solar.
Widely deploying a platform like Sighten's significantly and immediately impacts the residential solar space in several important ways: 1) standardizing and improving the quality of all quantitative calculations involved in the residential financing process, most notably project finance, system production, and reporting calculations; 2) representing a true step change in reporting and analysis capabilities by maintaining more accurate data and exposing sophisticated tools for simulation, tranching, and financial reporting, among others, to all stakeholders in the space; 3) allowing a broader group of developers, installers, and finance companies to access the capital markets by providing an out-of-the-box toolset that handles the execution of running investor capital through a rooftop solar financing program. Standardizing and improving all calculations, improving data quality, and exposing previously unavailable analysis tools affect investment in the residential space in several important ways: 1) lowering the cost of capital for existing capital providers by mitigating uncertainty and de-risking the solar asset class; 2) attracting new, lower-cost investors to the solar asset class as reporting and data quality come to resemble the standards of more mature asset classes; 3) increasing liquidity options for investors through back leverage, securitization, or secondary sale by providing the tools necessary for lenders, ratings agencies, etc. to properly understand a portfolio of residential solar assets. During the project period, Sighten successfully built and scaled a commercially ready tool for the residential solar market. The software solution built by Sighten has been deployed with the key target customer segments identified in the award deliverables: solar installers, solar developers/channel managers, and solar financiers, including lenders. Each of these segments benefits greatly from the availability of the Sighten toolset.
Radiation Hardening by Software Techniques on FPGAs: Flight Experiment Evaluation and Results
NASA Technical Reports Server (NTRS)
Schmidt, Andrew G.; Flatley, Thomas
2017-01-01
We present our work on implementing Radiation Hardening by Software (RHBSW) techniques on the PowerPC 440 processors of the Xilinx Virtex5 FPGAs on the SpaceCube 2.0 platform. The techniques have been matured and tested through simulation modeling, fault emulation, laser fault injection, and now in a flight experiment, as part of the Space Test Program-Houston 4 ISS SpaceCube Experiment 2.0 (STP-H4-ISE 2.0). This work leverages concepts such as heartbeat monitoring, control flow assertions, and checkpointing, commonly used in the high-performance computing industry, and adapts them for use in remote-sensing embedded systems. These techniques have extremely low overhead (typically <1.3%), enabling a 3.3x gain in processing performance compared with the equivalent traditionally radiation-hardened processor. The recently concluded STP-H4 flight experiment was an opportunity to upgrade the RHBSW techniques for the Virtex5 FPGA and demonstrate them on board the ISS to achieve TRL 7. This work details the implementation on the Virtex5-based SpaceCube 2.0 flight platform of the RHBSW techniques previously developed for the Virtex4-based SpaceCube 1.0 platform. The evaluation spans development and integration with the flight software, remotely uploading the new experiment to the ISS SpaceCube 2.0 platform, and running the experiment continuously for 16 days before the platform was decommissioned. The experiment ran on two PowerPCs embedded within the Virtex5 FPGA devices; it collected 19,400 checkpoints, processed 253,482 status messages, and incurred 0 faults. These results are highly encouraging, and future work is looking into longer-duration testing as part of the STP-H5 flight experiment.
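Of the RHBSW concepts listed, checkpointing with rollback is the easiest to sketch. The toy class below is a hypothetical illustration of the idea, not the SpaceCube flight code: the task snapshots its state at a fixed interval, and a watchdog that detects a fault (e.g. a missed or implausible heartbeat) restores the last good snapshot instead of restarting from zero.

```python
class CheckpointedTask:
    """Toy checkpoint/rollback sketch: state is snapshotted every
    `interval` steps; rollback() restores the last snapshot."""

    def __init__(self, interval=100):
        self.interval = interval
        self.state = {"step": 0, "acc": 0}
        self.snapshot = dict(self.state)  # last known-good state

    def step(self):
        """One unit of work; returns the step count as a heartbeat value."""
        self.state["acc"] += self.state["step"]
        self.state["step"] += 1
        if self.state["step"] % self.interval == 0:
            self.snapshot = dict(self.state)  # checkpoint
        return self.state["step"]

    def rollback(self):
        """Called by a watchdog on a missed or implausible heartbeat:
        discard the (possibly corrupted) state, resume from checkpoint."""
        self.state = dict(self.snapshot)
```

The low overhead reported in the abstract comes from the same trade-off this sketch makes: only a small snapshot is copied, and only once per interval, while every step emits a cheap heartbeat for the watchdog.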
Post-thymic maturation: young T cells assert their individuality.
Fink, Pamela J; Hendricks, Deborah W
2011-07-22
T cell maturation was once thought to occur entirely within the thymus. Now, evidence is mounting that the youngest peripheral T cells in both mice and humans comprise a population distinct from their more mature, yet still naive, counterparts. These cells, termed recent thymic emigrants (RTEs), undergo a process of post-thymic maturation that can be monitored at the levels of cell phenotype and immune function. Understanding this final maturation step in the process of generating useful and safe T cells is of clinical relevance, given that RTEs are over-represented in neonates and in adults recovering from lymphopenia. Post-thymic maturation may function to ensure T cell fitness and self-tolerance.
NASA Technical Reports Server (NTRS)
Kizhner, Semion; Flatley, Thomas P.; Hestnes, Phyllis; Jentoft-Nilsen, Marit; Petrick, David J.; Day, John H. (Technical Monitor)
2001-01-01
Spacecraft telemetry rates have steadily increased over the last decade, presenting a problem for real-time processing by ground facilities. This paper proposes a solution to a related problem for the Geostationary Operational Environmental Satellite (GOES-8) image processing application. Although large supercomputer facilities are the obvious heritage solution, they are very costly, making it imperative to seek a feasible alternative engineering solution at a fraction of the cost. The solution is based on a Personal Computer (PC) platform and a synergy of optimized software algorithms and re-configurable computing (RC) hardware technologies, such as Field Programmable Gate Arrays (FPGAs) and Digital Signal Processing (DSP). It has been shown in [1] and [2] that this configuration can provide superior, inexpensive performance for a chosen application on the ground station or on board a spacecraft. However, since this technology is still maturing, intensive pre-hardware steps are necessary to achieve the benefits of hardware implementation. This paper describes these steps for the GOES-8 application, a software project developed using Interactive Data Language (IDL) (trademark of Research Systems, Inc.) on a Workstation/UNIX platform. The solution involves converting the application to a PC/Windows/RC platform, selected mainly for the availability of low-cost, adaptable, high-speed RC hardware. In order for the hybrid system to run, the IDL software was modified to account for platform differences. It was instructive to examine the gains and losses in performance on the new platform, as well as the unexpected observations made before implementing hardware. After substantial pre-hardware optimization, the necessity of hardware implementation for bottleneck code in the PC environment became evident; it was addressed beginning with the methodology described in [1] and [2] and by implementing a novel methodology for this specific application [6].
The PC-RC interface bandwidth problem for the class of applications with moderate input-output data rates but large intermediate multi-thread data streams has been addressed and mitigated. This opens up the solution of bottleneck problems with RC technologies to a new class of satellite image processing applications. The issue of the level of abstraction of a science algorithm necessary for RC hardware implementation is also described. Selected Matlab functions already implemented in hardware were investigated for their direct applicability to the GOES-8 application, with the intent of creating a library of Matlab and IDL RC functions for ongoing work. A complete class of spacecraft image processing applications using embedded re-configurable computing technology to meet real-time requirements, including performance results and a comparison with the existing system, is described in this paper.
Liang, Zhongguan; Liu, Weiqing; Chen, Jun; Hu, Linhua; Dai, Songyuan
2015-01-21
After injection of the electrolyte, the internal three-dimensional solid-liquid penetration system of dye-sensitized solar cells (DSCs) can take some time to reach a "mature" state. This paper studies the changes in the microscopic processes of DSCs, including TiO2 energy-level movement, localized-state distribution, charge accumulation, electron transport, and recombination dynamics, from the moment of electrolyte injection to the time the mature state is reached. The results show that the microscopic dynamics of the DSCs exhibited time-dependent behavior and reached maturity ∼12 h after the electrolyte was injected. Within 0-12 h, several results were observed: (1) the conduction-band edge of TiO2 moved slightly in the negative potential direction; (2) the localized states in the band gap of TiO2 were reduced according to the same distribution law; (3) the transport resistance in the TiO2 film increased and the electron transport time lengthened as maturation proceeded, indicating that the electron-transport process is gradually impeded; (4) the recombination resistance at the TiO2/electrolyte (EL) interface increased and the electron lifetime gradually extended; the recombination process is therefore continuously suppressed. Furthermore, the results suggest that the parameters of the EL/Pt-transparent conductive oxide (TCO) interface, including the interfacial capacitance, electron-transfer resistance, and transfer time constant, change with maturation time, indicating that the EL/Pt-TCO interface is a potential factor affecting the maturing process of DSCs.
Unified Approach to Modeling and Simulation of Space Communication Networks and Systems
NASA Technical Reports Server (NTRS)
Barritt, Brian; Bhasin, Kul; Eddy, Wesley; Matthews, Seth
2010-01-01
Network simulator software tools are often used to model the behaviors and interactions of applications, protocols, packets, and data links in terrestrial communication networks. Other software tools that model the physics, orbital dynamics, and RF characteristics of space systems have matured to allow for rapid, detailed analysis of space communication links. However, the absence of a unified toolset that integrates the two modeling approaches has encumbered the systems engineers tasked with the design, architecture, and analysis of complex space communication networks and systems. This paper presents the unified approach and describes the motivation, challenges, and our solution - the customization of the network simulator to integrate with astronautical analysis software tools for high-fidelity end-to-end simulation. Keywords space; communication; systems; networking; simulation; modeling; QualNet; STK; integration; space networks
Vidaña-Pérez, Dèsirée; Braverman-Bronstein, Ariela; Basto-Abreu, Ana; Barrientos-Gutierrez, Inti; Hilscher, Rainer; Barrientos-Gutierrez, Tonatiuh
2018-01-11
Background: Video games are widely used by children and adolescents and have become a significant source of exposure to sexual content. Despite evidence of the important role of media in the development of sexual attitudes and behaviours, little attention has been paid to monitoring sexual content in video games. Methods: Data were obtained on sexual content and ratings for 23,722 video games released from 1994 to 2013 from the Entertainment Software Rating Board database; release dates and information on the top 100 selling video games were also obtained. The yearly prevalence of sexual content was calculated for each rating category. Trends and comparisons were estimated using Joinpoint regression. Results: Sexual content was present in 13% of the video games. Games rated 'Mature' had the highest prevalence of sexual content (34.5%), followed by 'Teen' (30.7%) and 'E10+' (21.3%). Over time, sexual content decreased in the 'Everyone' category, 'E10+' maintained a low prevalence, and 'Teen' and 'Mature' showed a marked increase. Both top and non-top video games showed constant increases, with top selling video games having 10.1% more sexual content across the period of study. Conclusion: Over the last 20 years, the prevalence of sexual content has increased in video games with a 'Teen' or 'Mature' rating. Further studies are needed to quantify the potential association between sexual content in video games and sexual behaviour in children and adolescents.
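The core measurement in a study like this is a prevalence table by release year and rating category. A minimal sketch of that step, assuming hypothetical record fields (`year`, `rating`, `sexual_content`); the Joinpoint trend regression itself is specialized software and is not reproduced here:

```python
from collections import defaultdict

def yearly_prevalence(games):
    """Share of games flagged with sexual content, per (year, rating) group."""
    counts = defaultdict(lambda: [0, 0])  # (year, rating) -> [flagged, total]
    for g in games:
        key = (g["year"], g["rating"])
        counts[key][1] += 1
        if g["sexual_content"]:
            counts[key][0] += 1
    return {k: flagged / total for k, (flagged, total) in counts.items()}

# Invented records for illustration only.
games = [
    {"year": 2010, "rating": "Mature", "sexual_content": True},
    {"year": 2010, "rating": "Mature", "sexual_content": False},
    {"year": 2010, "rating": "Teen",   "sexual_content": False},
]
print(yearly_prevalence(games))  # {(2010, 'Mature'): 0.5, (2010, 'Teen'): 0.0}
```

The resulting yearly series per rating category is what a trend model would then be fitted to.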
A UML-based metamodel for software evolution process
NASA Astrophysics Data System (ADS)
Jiang, Zuo; Zhou, Wei-Hong; Fu, Zhi-Tao; Xiong, Shun-Qing
2014-04-01
A software evolution process is a set of interrelated software processes under which the corresponding software is evolving. This paper presents an object-oriented software evolution process meta-model (OO-EPMM), its abstract syntax, and formal OCL constraints on the meta-model. OO-EPMM can represent not only the software development process but also software evolution.
Villar, Iria; Alves, David; Garrido, Josefina; Mato, Salustiano
2016-08-01
During composting, facilities usually exert greater control over the bio-oxidative phase of the process, which uses a specific technology and generally has a fixed duration. After this phase, the material is deposited to mature, with less monitoring during the maturation phase. While there has been considerable study of biological parameters during the thermophilic phase, there is less research on the stabilization and maturation phase. This study evaluates the effects of the type of starting material on the evolution of microbial dynamics during the maturation phase of composting. Three waste types were used: sludge from the fish processing industry, municipal sewage sludge and pig manure, each independently mixed with shredded pine wood as bulking agent. The composting system for each waste type comprised a static reactor with a capacity of 600 L for the bio-oxidative phase, followed by a stabilization and maturation phase in triplicate 200 L boxes for 112 days. Phospholipid fatty acids, enzyme activities and physico-chemical parameters were measured throughout the maturation phase. The evolution of the total microbial biomass, Gram-positive bacteria, Gram-negative bacteria, fungi and enzymatic activities (β-glucosidase, cellulase, protease, acid and alkaline phosphatase) depended significantly on the waste type (p<0.001). The predominant microbial community for each waste type remained present throughout the maturation process, indicating that the waste type determines the microorganisms that are able to develop at this stage. While fungi predominated during fish sludge maturation, manure and municipal sludge were characterized by a greater proportion of bacteria. Both the structure of the microbial community and the enzymatic activities provided important information for monitoring the composting process. More attention should be paid to the maturation phase in order to optimize composting. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
Automated reuseable components system study results
NASA Technical Reports Server (NTRS)
Gilroy, Kathy
1989-01-01
The Automated Reusable Components System (ARCS) was developed under a Phase 1 Small Business Innovative Research (SBIR) contract for the U.S. Army CECOM. The objectives of the ARCS program were: (1) to investigate issues associated with automated reuse of software components, identify alternative approaches, and select promising technologies, and (2) to develop tools that support component classification and retrieval. The approach followed was to research emerging techniques and experimental applications associated with reusable software libraries, to investigate the more mature information retrieval technologies for applicability, and to investigate the applicability of specialized technologies to improve the effectiveness of a reusable component library. Various classification schemes and retrieval techniques were identified and evaluated for potential application in an automated library system for reusable components. Strategies for library organization and management, component submittal and storage, and component search and retrieval were developed. A prototype ARCS was built to demonstrate the feasibility of automating the reuse process. The prototype was created using a subset of the classification and retrieval techniques that were investigated. The demonstration system was exercised and evaluated using reusable Ada components selected from the public domain. A requirements specification for a production-quality ARCS was also developed.
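The classification-and-retrieval core of such a library can be illustrated with a small faceted-keyword index. This is a hypothetical sketch of the general technique, not the ARCS implementation; the class, component, and facet names are invented:

```python
class ComponentLibrary:
    """Toy faceted classification and retrieval for reusable components."""

    def __init__(self):
        self.components = {}  # component name -> set of facet terms

    def submit(self, name, facets):
        """Catalogue a component under its classification facets."""
        self.components[name] = set(facets)

    def search(self, query):
        """Rank components by the number of facet terms matching the query."""
        q = set(query)
        hits = [(len(q & f), name) for name, f in self.components.items() if q & f]
        return [name for score, name in sorted(hits, reverse=True)]

lib = ComponentLibrary()
lib.submit("queue_pkg", ["ada", "data-structure", "queue"])
lib.submit("stack_pkg", ["ada", "data-structure", "stack"])
print(lib.search(["queue", "ada"]))  # ['queue_pkg', 'stack_pkg']
```

A production system of the kind the study envisions would add controlled vocabularies, synonym handling, and relevance feedback on top of this basic match-and-rank loop.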
NASA Astrophysics Data System (ADS)
Wang, Qiang
2017-09-01
As an important part of software engineering, the software process decides the success or failure of a software product. The design and development features of a security software process are discussed, as are the necessity and present significance of using such a process. In coordination with the functional software, the process for security software and its testing are discussed in depth. The process includes requirements analysis, design, coding, debugging and testing, submission, and maintenance. For each phase, the paper proposes subprocesses to support software security. As an example, the paper applies the above process to a power information platform.
Quality Management Systems Implementation Compared With Organizational Maturity in Hospital.
Moradi, Tayebeh; Jafari, Mehdi; Maleki, Mohammad Reza; Naghdi, Seyran; Ghiasvand, Hesam
2015-07-27
A quality management system can provide a framework for continuous improvement in order to increase the probability of customers' and other stakeholders' satisfaction. The test maturity model helps organizations to assess their degree of maturity in implementing effective and sustained quality management systems, plan based on the current realities of the organization, and prioritize their improvement programs. We aim to investigate and compare the level of organizational maturity in hospitals with the status of quality management systems implementation. This analytical cross-sectional study was conducted among hospital administrators and quality experts working in hospitals with over 200 beds located in Tehran. In the first step, 32 hospitals were selected, and then 96 employees working in the selected hospitals were studied. The data were gathered using the implementation checklist of quality management systems and the organizational maturity questionnaire derived from ISO 10014. Content validity was calculated using the Lawshe method, and reliability was estimated using the test-retest method and calculation of Cronbach's alpha coefficient. Descriptive and inferential statistics were used to analyze the data using SPSS 18 software. The mean score of organizational maturity among hospitals in the first stage of quality management systems implementation was equal to that of hospitals in the third stage, and the research hypothesis was rejected (p-value = 0.093). In general, there is no significant difference in organizational maturity between the first- and third-level hospitals (in terms of implementation of quality management systems). Overall, the findings of the study show that there is no significant difference in organizational maturity between hospitals at different levels of quality management systems implementation; in fact, the maturity of the organizations cannot be attributed to the implementation of such systems.
As a result, hospitals should make changes in the quantity and quality of quality management systems in an effort to increase organizational maturity, whereby they improve the hospital efficiency and productivity.
Quality Management Systems Implementation Compared With Organizational Maturity in Hospital
Moradi, Tayebeh; Jafari, Mehdi; Maleki, Mohammad Reza; Naghdi, Seyran; Ghiyasvand, Hesam
2016-01-01
Background: A quality management system can provide a framework for continuous improvement in order to increase the probability of customers' and other stakeholders' satisfaction. The test maturity model helps organizations to assess their degree of maturity in implementing effective and sustained quality management systems, plan based on the current realities of the organization, and prioritize their improvement programs. Objectives: We aim to investigate and compare the level of organizational maturity in hospitals with the status of quality management systems implementation. Materials and Methods: This analytical cross-sectional study was conducted among hospital administrators and quality experts working in hospitals with over 200 beds located in Tehran. In the first step, 32 hospitals were selected, and then 96 employees working in the selected hospitals were studied. The data were gathered using the implementation checklist of quality management systems and the organizational maturity questionnaire derived from ISO 10014. Content validity was calculated using the Lawshe method, and reliability was estimated using the test-retest method and calculation of Cronbach's alpha coefficient. Descriptive and inferential statistics were used to analyze the data using SPSS 18 software. Results: The mean score of organizational maturity among hospitals in the first stage of quality management systems implementation was equal to that of hospitals in the third stage, and the research hypothesis was rejected (p-value = 0.093). In general, there is no significant difference in organizational maturity between the first- and third-level hospitals (in terms of implementation of quality management systems).
Conclusions: Overall, the findings of the study show that there is no significant difference in the organizational maturity between the hospitals in different levels of the quality management systems implementation and in fact, the maturity of the organizations cannot be attributed to the implementation of such systems. As a result, hospitals should make changes in the quantity and quality of quality management systems in an effort to increase organizational maturity, whereby they improve the hospital efficiency and productivity. PMID:26493411
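The reliability estimate named in this abstract, Cronbach's alpha, can be computed directly from raw item scores. A generic sketch with made-up questionnaire scores, not the study's data:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for k questionnaire items.

    `items` is a list of k columns, each holding one item's scores
    across the same n respondents."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    sum_item_vars = sum(var(col) for col in items)
    totals = [sum(col[i] for col in items) for i in range(n)]  # per-respondent total
    return k / (k - 1) * (1 - sum_item_vars / var(totals))

# Three items scored by four respondents (illustrative values only).
scores = [[2, 4, 3, 5], [3, 4, 2, 5], [2, 5, 3, 4]]
print(round(cronbach_alpha(scores), 3))  # 0.892
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency, which is the threshold questionnaire studies like this one typically report against.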
Using the Assessment Process to Overcome Imposter Syndrome in Mature Students
ERIC Educational Resources Information Center
Chapman, Amanda
2017-01-01
This research draws on the experiences of a group of mature students during their first year at university. All experienced varying degrees of Imposter Syndrome: feelings of fraudulence and a lack of confidence in their ability. The process of "becoming" a mature student is one of identity change and risk. Gaining a sense of…
Mobile Technology in 2020: Predictions and Implications for K-12 Education
ERIC Educational Resources Information Center
Norris, Cathleen A.; Soloway, Elliot
2015-01-01
While "mobile learning" has gained recognition in K-12 as a category in educational technology, the authors argue that, between 2010 and 2015, at least, its impact hasn't matched the hype. But between 2015 and 2020, hardware, software, and network technologies will mature sufficiently such that educational technology's Holy…
NASA Technical Reports Server (NTRS)
Meakin, Robert L.
1995-01-01
Grid related issues of the Chimera overset grid method are discussed in the context of a method of solution and analysis of unsteady three-dimensional viscous flows. The state of maturity of the various pieces of support software required to use the approach is considered. Current limitations of the approach are identified.
M-Rated Video Games and Aggressive or Problem Behavior among Young Adolescents
ERIC Educational Resources Information Center
Olson, Cheryl K.; Kutner, Lawrence A.; Baer, Lee; Beresin, Eugene V.; Warner, Dorothy E.; Nicholi, Armand M., II
2009-01-01
This research examined the potential relationship between adolescent problem behaviors and amount of time spent with violent electronic games. Survey data were collected from 1,254 7th and 8th grade students in two states. A "dose" of exposure to Mature-rated games was calculated using Entertainment Software Rating Board ratings of…
Techakanon, Chukwan; Gradziel, Thomas M; Zhang, Lu; Barrett, Diane M
2016-09-28
Fruit maturity is an important factor associated with final product quality, and it may affect the level of browning in peaches that are high-pressure processed (HPP). Peaches of three different maturities, as determined by firmness (M1 = 50-55 N, M2 = 35-40 N, and M3 = 15-20 N), were subjected to pressure levels of 0.1, 200, and 400 MPa for 10 min. The damage from HPP treatment results in loss of fruit integrity and the development of browning during storage. Increasing pressure levels of HPP treatment resulted in greater damage, particularly in the more mature peaches, as determined by shifts in the transverse relaxation time (T2) of the vacuolar component and by light microscopy. The discoloration of peach slices of different maturities processed at the same pressure was comparable, indicating that the effect of pressure level on the development of browning is greater than that of maturity.
Cohen, Y; Steppuhn, J; Herrmann, R G; Yalovsky, S; Nechushtai, R
1992-01-01
The biogenesis and assembly of subunit II of photosystem I (PSI) (the psaD gene product) were studied and characterized. The precursor and the mature form were produced in vitro and incubated with intact plastids or isolated thylakoids. Following import of the precursor into isolated plastids, mostly the mature form of subunit II was found in the thylakoids. However, when the processing activity was inhibited, only the precursor form was present in the membranes. The precursor was processed by a stromal peptidase, and processing could occur before or after insertion of the precursor into the thylakoids. Following insertion into isolated thylakoids, both the precursor and the mature form of subunit II were confined to the PSI complex. Insertion of the mature form of subunit II was much less efficient than that of the precursor. Kinetic studies showed that the precursor was inserted into the membrane first; only at a later stage did the mature form begin to accumulate. These results suggest that in vivo the precursor of subunit II is inserted and embedded in the thylakoids as part of the PSI complex, and only later is it processed to the mature form through the action of a stromal peptidase. PMID: 1740118
Retrovirus maturation-an extraordinary structural transformation.
Mattei, Simone; Schur, Florian KM; Briggs, John AG
2016-06-01
Retroviruses such as HIV-1 assemble and bud from infected cells in an immature, non-infectious form. Subsequently, a series of proteolytic cleavages catalysed by the viral protease leads to a spectacular structural rearrangement of the viral particle into a mature form that is competent to fuse with and infect a new cell. Maturation involves changes in the structures of protein domains, in the interactions between protein domains, and in the architecture of the viral components that are assembled by the proteins. Tight control of proteolytic cleavages at different sites is required for successful maturation, and the process is a major target of antiretroviral drugs. Here we will describe what is known about the structures of immature and mature retrovirus particles, and about the maturation process by which one transitions into the other. Despite a wealth of available data, fundamental questions about retroviral maturation remain unanswered. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
Taschner, Michael J; Rafigh, Mehran; Lampert, Fabienne; Schnaiter, Simon; Hartmann, Christine
2008-05-01
The long bones of vertebrate limbs originate from cartilage templates and are formed by the process of endochondral ossification. This process requires that chondrocytes undergo a progressive maturation from proliferating to postmitotic prehypertrophic to mature, hypertrophic chondrocytes. Coordinated control of proliferation and maturation regulates growth of the skeletal elements. Various signals and pathways have been implicated in orchestrating these processes, but the underlying intracellular molecular mechanisms are often not entirely known. Here we demonstrated in the chick using replication-competent retroviruses that constitutive activation of Calcium/Calmodulin-dependent kinase II (CaMKII) in the developing wing resulted in elongation of skeletal elements associated with premature differentiation of chondrocytes. The premature maturation of chondrocytes was a cell-autonomous effect of constitutive CaMKII signaling associated with down-regulation of cell-cycle regulators and up-regulation of chondrocyte maturation markers. In contrast, the elongation of the skeletal elements resulted from a non-cell autonomous up-regulation of the Indian hedgehog responsive gene encoding Parathyroid-hormone-related peptide. Reduction of endogenous CaMKII activity by overexpressing an inhibitory peptide resulted in shortening of the skeletal elements associated with a delay in chondrocyte maturation. Thus, CaMKII is an essential component of intracellular signaling pathways regulating chondrocyte maturation.
Data Management Applications for the Service Preparation Subsystem
NASA Technical Reports Server (NTRS)
Luong, Ivy P.; Chang, George W.; Bui, Tung; Allen, Christopher; Malhotra, Shantanu; Chen, Fannie C.; Bui, Bach X.; Gutheinz, Sandy C.; Kim, Rachel Y.; Zendejas, Silvino C.;
2009-01-01
These software applications provide intuitive User Interfaces (UIs) with a consistent look and feel for interaction with, and control of, the Service Preparation Subsystem (SPS). The elements of the UIs described here are the File Manager, Mission Manager, and Log Monitor applications. All UIs provide access to add/delete/update data entities in a complex database schema without requiring technical expertise on the part of the end users. These applications allow for safe, validated, catalogued input of data. Also, the software has been designed in multiple, coherent layers to promote ease of code maintenance and reuse in addition to reducing testing and accelerating maturity.
Content and ratings of mature-rated video games.
Thompson, Kimberly M; Tepichin, Karen; Haninger, Kevin
2006-04-01
Objective: To quantify the depiction of violence, blood, sexual themes, profanity, substances, and gambling in video games rated M (for "mature") and to measure agreement between the content observed and the rating information provided to consumers on the game box by the Entertainment Software Rating Board. Design: We created a database of M-rated video game titles, selected a random sample, recorded at least 1 hour of game play, quantitatively assessed the content, performed statistical analyses to describe the content, and compared our observations with the Entertainment Software Rating Board content descriptors and the results of our prior studies. Setting: Harvard University, Boston, Mass. Participants: Authors and 1 hired game player. Main exposure: M-rated video games. Main outcome measures: Percentages of game play depicting violence, blood, sexual themes, gambling, alcohol, tobacco, or other drugs; use of profanity in dialogue, song lyrics, or gestures. Results: Although the Entertainment Software Rating Board content descriptors for violence and blood provide a good indication of such content in the game, we identified 45 observations of content that could warrant a content descriptor in 29 games (81%) that lacked these content descriptors. M-rated video games are significantly more likely to contain blood, profanity, and substances; depict more severe injuries to human and nonhuman characters; and have a higher rate of human deaths than video games rated T (for "teen"). Conclusions: Parents and physicians should recognize that popular M-rated video games contain a wide range of unlabeled content and may expose children and adolescents to messages that may negatively influence their perceptions, attitudes, and behaviors.
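The comparison between content observed during play and the descriptors printed on the box reduces to a set difference. A toy sketch with invented content labels, not the study's coding scheme:

```python
def undisclosed_content(observed, descriptors):
    """Content categories seen during game play but absent from the
    ESRB content descriptors listed on the game box."""
    return sorted(set(observed) - set(descriptors))

# Hypothetical observations for one game.
seen_in_play = ["violence", "blood", "profanity", "substances"]
on_the_box = ["violence", "blood"]
print(undisclosed_content(seen_in_play, on_the_box))  # ['profanity', 'substances']
```

Tallying the output across a sample of games yields counts like the study's "45 observations in 29 games" figure.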
NASA Astrophysics Data System (ADS)
Gonzalez Lopez, J. B.; Avilés, A.; Baron, T.; Ferreira, P.; Kolobara, B.; Pugh, M. A.; Resco, A.; Trzaskoma, J. P.
2014-06-01
Indico has evolved into the main event organization software, room booking tool and collaboration hub for CERN. The growth in its usage has only accelerated during the past 9 years, and today Indico holds more than 215,000 events and 1,100,000 files. The growth has also been substantial in terms of functionalities and improvements. In the last year alone, Indico has matured considerably in 3 key areas: enhanced usability, optimized performance and additional features, especially those related to meeting collaboration. Along the course of 2012, much activity centred around consolidating all this effort and investment into "version 1.0", released in 2013. Version 1.0 brings along new features, such as Microsoft Exchange calendar synchronization for participants, many new and cleaner interfaces (badge and poster generation, lists of contributions, abstracts, etc.) and so forth. But most importantly, it brings a message: Indico is now stable, consolidated and mature after more than 10 years of non-stop development. This message is addressed not only to CERN users but also to the many organisations, in or outside HEP, which have already installed the software, and to others who might soon join this community. In this document, we describe the current state of the art of Indico and how it was built. This does not mean that the Indico software is complete, far from it! We have plenty of new ideas and projects that we are working on and which we have shared during CHEP 2013.
Automated image analysis of placental villi and syncytial knots in histological sections.
Kidron, Debora; Vainer, Ifat; Fisher, Yael; Sharony, Reuven
2017-05-01
Delayed villous maturation and accelerated villous maturation diagnosed in histologic sections are morphologic manifestations of pathophysiological conditions. The inter-observer agreement among pathologists in assessing these conditions is moderate at best. We investigated whether automated image analysis of placental villi and syncytial knots could improve standardization in diagnosing these conditions. Placentas of antepartum fetal death at or near term were diagnosed as normal, delayed or accelerated villous maturation. Histologic sections of 5 cases per group were photographed at ×10 magnification. Automated image analysis of villi and syncytial knots was performed, using ImageJ public domain software. Analysis of hundreds of histologic images was carried out within minutes on a personal computer, using macro commands. Compared to normal placentas, villi from delayed maturation were larger and fewer, with fewer and smaller syncytial knots. Villi from accelerated maturation were smaller. The data were further analyzed according to horizontal placental zones and groups of villous size. Normal placentas can be discriminated from placentas of delayed or accelerated villous maturation using automated image analysis. Automated image analysis of villi and syncytial knots is not equivalent to interpretation by the human eye. Each method has advantages and disadvantages in assessing the 2-dimensional histologic sections representing the complex, 3-dimensional villous tree. Image analysis of placentas provides quantitative data that might help in standardizing and grading of placentas for diagnostic and research purposes. Copyright © 2017 Elsevier Ltd. All rights reserved.
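The counting-and-sizing step that ImageJ's particle analysis performs can be sketched as connected-component labeling over a thresholded binary mask. A minimal pure-Python stand-in; the mask values are invented, and a real pipeline would first threshold the ×10 micrographs:

```python
def label_objects(mask):
    """Count connected foreground regions (e.g., villous cross-sections)
    in a binary mask and return their pixel areas -- a toy stand-in for
    the 'Analyze Particles' step of an ImageJ macro."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    areas = []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                stack, area = [(r, c)], 0  # flood-fill one region
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    area += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                areas.append(area)
    return areas

mask = [
    [1, 1, 0, 0],
    [1, 0, 0, 1],
    [0, 0, 0, 1],
]
print(label_objects(mask))  # [3, 2]: two regions, areas 3 and 2
```

Object counts and area distributions of this kind are the quantitative data the study compares across normal, delayed, and accelerated maturation groups.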
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-22
... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Developing Software Life Cycle Processes for Digital... Software Life Cycle Processes for Digital Computer Software used in Safety Systems of Nuclear Power Plants... clarifications, the enhanced consensus practices for developing software life-cycle processes for digital...
Software Formal Inspections Standard
NASA Technical Reports Server (NTRS)
1993-01-01
This Software Formal Inspections Standard (hereinafter referred to as Standard) is applicable to NASA software. This Standard defines the requirements that shall be fulfilled by the software formal inspections process whenever this process is specified for NASA software. The objective of this Standard is to define the requirements for a process that inspects software products to detect and eliminate defects as early as possible in the software life cycle. The process also provides for the collection and analysis of inspection data to improve the inspection process as well as the quality of the software.
Building a Snow Data System on the Apache OODT Open Technology Stack
NASA Astrophysics Data System (ADS)
Goodale, C. E.; Painter, T. H.; Mattmann, C. A.; Hart, A. F.; Ramirez, P.; Zimdars, P.; Bryant, A. C.; Snow Data System Team
2011-12-01
Snow cover and its melt dominate regional climate and hydrology in many of the world's mountainous regions. One-sixth of Earth's population depends on snow- or glacier-melt for water resources. Operationally, seasonal forecasts of snowmelt-generated streamflow are leveraged through empirical relations based on past snowmelt periods. These historical data show that climate is changing, but the changes reduce the reliability of the empirical relations. Therefore optimal future management of snowmelt derived water resources will require explicit physical models driven by remotely sensed snow property data. Toward this goal, the Snow Optics Laboratory at the Jet Propulsion Laboratory has initiated a near real-time processing pipeline to generate and publish post-processed snow data products within a few hours of satellite acquisition. To solve this challenge, a Scientific Data Management and Processing System was required and the JPL Team leveraged an open-source project called Object Oriented Data Technology (OODT). OODT was developed within NASA's Jet Propulsion Laboratory across the last 10 years. OODT has supported various scientific data management and processing projects, providing solutions in the Earth, Planetary, and Medical science fields. It became apparent that the project needed to be opened to a larger audience to foster and promote growth and adoption. OODT was open-sourced at the Apache Software Foundation in November 2010 and has a growing community of users and committers that are constantly improving the software. Leveraging OODT, the JPL Snow Data System (SnowDS) Team was able to install and configure a core Data Management System (DMS) that would download MODIS raw data files and archive the products in a local repository for post processing. The team has since built an online data portal, and an algorithm-processing pipeline using the Apache OODT software as the foundation. 
We will present the working SnowDS system with its core remote sensing components: the MODIS Snow Covered Area and Grain size model (MODSCAG) and the MODIS Dust Radiative Forcing in Snow (MOD-DRFS). These products will be delivered in near real time to water managers and the broader cryosphere and climate community beginning in Winter 2012. We will then present the challenges and opportunities we see in the future as the SnowDS matures and contributions are made back to the OODT project.
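The crawl-archive-process loop at the heart of such a pipeline can be sketched in a few lines. This is a toy illustration of the pattern, not OODT's actual API; the MODIS-style file naming (`PRODUCT.AYYYYDDD.hdf`), used here to derive the archive folder, is an assumption:

```python
import pathlib
import shutil

def ingest(incoming: pathlib.Path, archive: pathlib.Path, process):
    """Move each new granule from `incoming` into a per-date archive
    directory, then invoke a processing callback on the archived copy."""
    archived = []
    for granule in sorted(incoming.glob("*.hdf")):
        # Assumed naming: PRODUCT.AYYYYDDD.hdf -> date token is field 2.
        date_token = granule.name.split(".")[1]
        dest_dir = archive / date_token
        dest_dir.mkdir(parents=True, exist_ok=True)
        dest = dest_dir / granule.name
        shutil.move(str(granule), dest)
        process(dest)  # e.g., hand off to a snow-cover algorithm
        archived.append(dest)
    return archived
```

A real deployment adds what OODT's file manager and workflow manager components provide: metadata extraction, catalogued archives, daemonized crawlers, and managed multi-step processing.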
Software Measurement Guidebook
NASA Technical Reports Server (NTRS)
1995-01-01
This Software Measurement Guidebook is based on the extensive experience of several organizations that have each developed and applied significant measurement programs over a period of at least 10 years. The lessons derived from those experiences reflect not only successes but also failures. By applying those lessons, an organization can minimize, or at least reduce, the time, effort, and frustration of introducing a software measurement program. The Software Measurement Guidebook is aimed at helping organizations to begin or improve a measurement program. It does not provide guidance for the extensive application of specific measures (such as how to estimate software cost or analyze software complexity) other than by providing examples to clarify points. It does contain advice for establishing and using an effective software measurement program and for understanding some of the key lessons that other organizations have learned. Some of that advice will appear counterintuitive, but it is all based on actual experience. Although all of the information presented in this guidebook is derived from specific experiences of mature measurement programs, the reader must keep in mind that the characteristics of every organization are unique. Some degree of measurement is critical for all software development and maintenance organizations, and most of the key rules captured in this report will be generally applicable. Nevertheless, each organization must strive to understand its own environment so that the measurement program can be tailored to suit its characteristics and needs.
Characterization of 16S rRNA Processing with Pre-30S Subunit Assembly Intermediates from E. coli.
Smith, Brian A; Gupta, Neha; Denny, Kevin; Culver, Gloria M
2018-06-08
Ribosomal RNA (rRNA) is a major component of ribosomes and is fundamental to the process of translation. In bacteria, 16S rRNA is a component of the small ribosomal subunit and plays a critical role in mRNA decoding. rRNA maturation entails the removal of intervening spacer sequences contained within the pre-rRNA transcript by nucleolytic enzymes. Enzymatic activities involved in maturation of the 5'-end of 16S rRNA have been identified, but those involved in 3'-end maturation of 16S rRNA are more enigmatic. Here, we investigate molecular details of 16S rRNA maturation using purified in vivo-formed small subunit (SSU) assembly intermediates (pre-SSUs) from wild-type Escherichia coli that contain precursor 16S rRNA (17S rRNA). Upon incubation of pre-SSUs with E. coli S100 cell extracts or purified enzymes implicated in 16S rRNA processing, the 17S rRNA is processed into additional intermediates and mature 16S rRNA. These results illustrate that exonucleases RNase R, RNase II, PNPase, and RNase PH can process the 3'-end of pre-SSUs in vitro. However, the endonuclease YbeY did not exhibit nucleolytic activity with pre-SSUs under these conditions. Furthermore, these data demonstrate that multiple pathways facilitate 16S rRNA maturation with pre-SSUs in vitro, with the dominant pathways entailing complete processing of the 5'-end of 17S rRNA prior to 3'-end maturation or partial processing of the 5'-end with concomitant processing of the 3'-end. These results reveal the multifaceted nature of SSU biogenesis and suggest that E. coli may be able to escape inactivation of any one enzyme by using an existing complementary pathway. Copyright © 2018 Elsevier Ltd. All rights reserved.
The Evolution of Big Data and Learning Analytics in American Higher Education
ERIC Educational Resources Information Center
Picciano, Anthony G.
2012-01-01
Data-driven decision making, popularized in the 1980s and 1990s, is evolving into a vastly more sophisticated concept known as big data that relies on software approaches generally referred to as analytics. Big data and analytics for instructional applications are in their infancy and will take a few years to mature, although their presence is…
SDI Software Technology Program Plan Version 1.5
1987-06-01
computer generation of auditory communication of meaningful speech. Most speech synthesizers are based on mathematical models of the human vocal tract, but...oral/auditory and multimodal communications. Although such state-of-the-art interaction technology has not fully matured, user experience has...superior pattern matching capabilities and the subliminal intuitive deduction capability. The error performance of humans can be helped by careful
NASA Technical Reports Server (NTRS)
Gallis, Michael A.; LeBeau, Gerald J.; Boyles, Katie A.
2003-01-01
The Direct Simulation Monte Carlo method was used to provide 3-D simulations of the early entry phase of the Shuttle Orbiter. Undamaged and damaged scenarios were modeled to provide calibration points for engineering "bridging function" analyses. Currently, the simulation technology (software and hardware) is mature enough to allow realistic simulations of three-dimensional vehicles.
Global flowfield about the V-22 Tiltrotor Aircraft
NASA Technical Reports Server (NTRS)
Meakin, Robert L.
1995-01-01
The Chimera overset grid method is reviewed and discussed in the context of a method of solution and analysis of unsteady three-dimensional viscous flows. The state of maturity of the various pieces of support software required to use the approach is discussed. A variety of recent applications of the method is presented. Current limitations of the approach are identified.
The Chimera Method of Simulation for Unsteady Three-Dimensional Viscous Flow
NASA Technical Reports Server (NTRS)
Meakin, Robert L.
1996-01-01
The Chimera overset grid method is reviewed and discussed in the context of a method of solution and analysis of unsteady three-dimensional viscous flows. The state of maturity of the various pieces of support software required to use the approach is discussed. A variety of recent applications of the method is presented. Current limitations of the approach are defined.
Decision Aids Using Heterogeneous Intelligence Analysis
2010-08-20
developing a Geocultural service, a software framework and inferencing engine for the Transparent Urban Structures program. The scope of the effort...has evolved as the program has matured and now includes multiple data sources, as well as interfaces out to the ONR architectural framework. Tasks...
Workflow-Based Software Development Environment
NASA Technical Reports Server (NTRS)
Izygon, Michel E.
2013-01-01
The Software Developer's Assistant (SDA) helps software teams more efficiently and accurately conduct or execute software processes associated with NASA mission-critical software. SDA is a process enactment platform that guides software teams through project-specific standards, processes, and procedures. Software projects are decomposed into all of their required process steps or tasks, and each task is assigned to project personnel. SDA orchestrates the performance of work required to complete all process tasks in the correct sequence. The software then notifies team members when they may begin work on their assigned tasks and provides the tools, instructions, reference materials, and supportive artifacts that allow users to compliantly perform the work. A combination of technology components captures and enacts any software process used to support the software lifecycle. It creates an adaptive workflow environment that can be modified as needed. SDA achieves software process automation through a Business Process Management (BPM) approach to managing the software lifecycle for mission-critical projects. It contains five main parts: TieFlow (workflow engine), Business Rules (rules to alter process flow), Common Repository (storage for project artifacts, versions, history, schedules, etc.), SOA (interface to allow internal, GFE, or COTS tools integration), and the Web Portal Interface (a collaborative web environment).
Bone age maturity assessment using hand-held device
NASA Astrophysics Data System (ADS)
Ratib, Osman M.; Gilsanz, Vicente; Liu, Xiaodong; Boechat, M. I.
2004-04-01
Purpose: Assessment of bone maturity is traditionally performed through visual comparison of a hand and wrist radiograph with existing reference images in textbooks. Our goal was to develop a digital index based on idealized hand X-ray images that can be incorporated in a hand-held computer and used for visual assessment of bone age for patients. Material and methods: Due to the large variability in bone maturation in normal subjects, we generated a set of "ideal" images obtained by computer combinations of images from our normal reference data sets. Software for hand-held PDA devices was developed for easy navigation through the set of images and visual selection of matching images. A formula based on our statistical analysis provides the standard deviation from normal based on the chronological age of the patient. The accuracy of the program was compared to traditional interpretation by two radiologists in a double-blind reading of 200 normal Caucasian children (100 boys, 100 girls). Results: Strong correlations were present between chronological age and bone age (r > 0.9), with no statistical difference between the digital and traditional assessment methods. Determinations of carpal bone maturity in adolescents were slightly more accurate using the digital system. Users praised the convenience and effectiveness of the digital Palm Index in clinical practice. Conclusion: An idealized digital Palm Bone Age Index provides a convenient and effective alternative to conventional atlases for the assessment of skeletal maturity.
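The abstract does not give the actual formula; as a hedged illustration only, a "standard deviation from normal" score of the kind described could be computed from reference data like this (the reference table, function name, and values below are invented for demonstration, not the authors' model):

```python
# Hypothetical sketch: bone-age deviation score from reference statistics.
# The reference means/SDs below are illustrative placeholders only.

REFERENCE = {  # chronological age (years) -> (mean bone age, SD)
    8.0: (8.0, 0.9),
    10.0: (10.0, 0.5),
    12.0: (12.2, 1.2),
}

def deviation_score(chronological_age, assessed_bone_age):
    """How many SDs the assessed bone age deviates from the reference
    mean for the child's chronological age (a simple z-score)."""
    mean, sd = REFERENCE[chronological_age]
    return (assessed_bone_age - mean) / sd

print(deviation_score(10.0, 11.0))  # 2.0 SDs above the reference mean
```

A score near zero would indicate bone age matching chronological age; large positive or negative values flag advanced or delayed maturation.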
Bringing the Unidata IDV to the Cloud
NASA Astrophysics Data System (ADS)
Fisher, W. I.; Oxelson Ganter, J.
2015-12-01
Maintaining software compatibility across new computing environments and the associated underlying hardware is a common problem for software engineers and scientific programmers. While traditional software engineering provides a suite of tools and methodologies which may mitigate this issue, they are typically ignored by developers lacking a background in software engineering. Causing further problems, these methodologies are best applied at the start of a project; trying to apply them to an existing, mature project can require an immense effort. Visualization software is particularly vulnerable to this problem, given the inherent dependency on particular graphics hardware and software APIs. As a result of these issues, there exists a large body of software which is simultaneously critical to the scientists who depend upon it, and yet increasingly difficult to maintain. The solution to this problem was partially provided with the advent of cloud computing: application streaming. This technology allows a program to run entirely on a remote virtual machine while still allowing for interactivity and dynamic visualizations, with little-to-no re-engineering required. When coupled with containerization technology such as Docker, we are able to easily bring the same visualization software to a desktop, a netbook, a smartphone, and the next generation of hardware, whatever it may be. Unidata has been able to harness application streaming to provide a tablet-compatible version of our visualization software, the Integrated Data Viewer (IDV). This work will examine the challenges associated with adapting the IDV to an application streaming platform, and includes a brief discussion of the underlying technologies involved.
TBC-8, a putative RAB-2 GAP, regulates dense core vesicle maturation in Caenorhabditis elegans.
Hannemann, Mandy; Sasidharan, Nikhil; Hegermann, Jan; Kutscher, Lena M; Koenig, Sabine; Eimer, Stefan
2012-01-01
Dense core vesicles (DCVs) are thought to be generated at the late Golgi apparatus as immature DCVs, which subsequently undergo a maturation process through clathrin-mediated membrane remodeling events. This maturation process is required for efficient processing of neuropeptides within DCVs and for removal of factors that would otherwise interfere with DCV release. Previously, we have shown that the GTPase, RAB-2, and its effector, RIC-19, are involved in DCV maturation in Caenorhabditis elegans motoneurons. In rab-2 mutants, specific cargo is lost from maturing DCVs and missorted into the endosomal/lysosomal degradation route. Cargo loss could be prevented by blocking endosomal delivery. This suggests that RAB-2 is involved in retention of DCV components during the sorting process at the Golgi-endosomal interface. To understand how RAB-2 activity is regulated at the Golgi, we screened for RAB-2-specific GTPase activating proteins (GAPs). We identified a potential RAB-2 GAP, TBC-8, which is exclusively expressed in neurons and which, when depleted, shows similar DCV maturation defects as rab-2 mutants. We could demonstrate that RAB-2 binds to its putative GAP, TBC-8. Interestingly, TBC-8 also binds to the RAB-2 effector, RIC-19. This interaction appears to be conserved as TBC-8 also interacted with the human ortholog of RIC-19, ICA69. Therefore, we propose that a dynamic ON/OFF cycling of RAB-2 at the Golgi induced by the GAP/effector complex is required for proper DCV maturation.
Moral Judgment Maturity of Process and Reactive Schizophrenics.
ERIC Educational Resources Information Center
Herron, William G.; And Others
1983-01-01
Premorbid adjustment, paranoid symptomatology, and orientation were examined as major predictors of moral judgment maturity in 40 schizophrenics. Results suggest the importance of cognitive and social skills in the development of schizophrenics' moral judgment maturity. (Author/RH)
The pivotal role of abscisic acid signaling during transition from seed maturation to germination.
Yan, An; Chen, Zhong
2017-05-01
Seed maturation and germination are two continuous developmental processes that link two distinct generations in spermatophytes; the precise genetic control of these two processes is, therefore, crucially important for the survival of the next generation. Experimental evidence accumulated so far indicates that a concerted action of endogenous signals and environmental cues is required to govern these processes. The plant hormone abscisic acid (ABA) has been suggested to play a predominant role in directing seed maturation and maintaining seed dormancy under unfavorable environmental conditions until antagonized by gibberellins (GA) and certain environmental cues, which allow seed germination to commence when conditions are favorable; the balance of ABA and GA is therefore a major determinant of the timing of seed germination. Thanks to new technologies and systems biology approaches, molecular studies over the past decade have begun to draw a picture of the sophisticated genetic network that drives seed maturation, though the picture is still incomplete and many details are missing. In this review, we summarize recent advances in the ABA signaling pathway in the regulation of seed maturation as well as the transition from seed maturation to germination, and highlight the importance of systems biology approaches in the study of seed maturation.
A Case Study of Measuring Process Risk for Early Insights into Software Safety
NASA Technical Reports Server (NTRS)
Layman, Lucas; Basili, Victor; Zelkowitz, Marvin V.; Fisher, Karen L.
2011-01-01
In this case study, we examine software safety risk in three flight hardware systems in NASA's Constellation spaceflight program. We applied our Technical and Process Risk Measurement (TPRM) methodology to the Constellation hazard analysis process to quantify the technical and process risks involving software safety in the early design phase of these projects. We analyzed 154 hazard reports and collected metrics to measure the prevalence of software in hazards and the specificity of descriptions of software causes of hazardous conditions. We found that for 49-70% of the 154 hazardous conditions, software either could cause the condition or was involved in its prevention. We also found that 12-17% of the 2,013 hazard causes involved software, and that 23-29% of all causes had a software control. The application of the TPRM methodology identified process risks in the application of the hazard analysis process itself that may lead to software safety risk.
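The prevalence metrics described above amount to counting, across hazard causes, how often software appears as a cause or as a control. A minimal sketch of that kind of tally (the record structure and field names here are assumptions for illustration, not the study's actual data format):

```python
# Illustrative sketch of computing software-prevalence metrics over
# hazard causes. The example records are invented, not the study's data.

hazard_causes = [
    {"id": "C-001", "software_cause": True,  "software_control": False},
    {"id": "C-002", "software_cause": False, "software_control": True},
    {"id": "C-003", "software_cause": False, "software_control": False},
    {"id": "C-004", "software_cause": True,  "software_control": True},
]

def prevalence(causes, field):
    """Percentage of causes for which the given boolean field is set."""
    return 100.0 * sum(c[field] for c in causes) / len(causes)

print(prevalence(hazard_causes, "software_cause"))    # 50.0
print(prevalence(hazard_causes, "software_control"))  # 50.0
```

Applied to the study's roughly two thousand causes, tallies like these yield the percentage ranges quoted in the abstract.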
Thoth: Software for data visualization & statistics
NASA Astrophysics Data System (ADS)
Laher, R. R.
2016-10-01
Thoth is a standalone software application with a graphical user interface that makes it easy to query, display, visualize, and analyze tabular data stored in relational databases and data files. From imported data tables, it can create pie charts, bar charts, scatter plots, and many other kinds of data graphs with simple menus and mouse clicks (no programming required), by leveraging the open-source JFreeChart library. It also computes useful table-column data statistics. A mature tool that has undergone development and testing over several years, it is written in the Java computer language, and hence can run on any computing platform that has a Java Virtual Machine and graphical-display capability. It can be downloaded and used by anyone free of charge, and has general applicability in science, engineering, medical, business, and other fields. Special tools and features for common tasks in astronomy and astrophysical research are included in the software.
Software Engineering Guidebook
NASA Technical Reports Server (NTRS)
Connell, John; Wenneson, Greg
1993-01-01
The Software Engineering Guidebook describes SEPG (Software Engineering Process Group) supported processes and techniques for engineering quality software in NASA environments. Three process models are supported: structured, object-oriented, and evolutionary rapid-prototyping. The guidebook covers software life-cycles, engineering, assurance, and configuration management. The guidebook is written for managers and engineers who manage, develop, enhance, and/or maintain software under the Computer Software Services Contract.
Induced maturation of human immunodeficiency virus.
Mattei, Simone; Anders, Maria; Konvalinka, Jan; Kräusslich, Hans-Georg; Briggs, John A G; Müller, Barbara
2014-12-01
HIV-1 assembles at the plasma membrane of virus-producing cells as an immature, noninfectious particle. Processing of the Gag and Gag-Pol polyproteins by the viral protease (PR) activates the viral enzymes and results in dramatic structural rearrangements within the virion--termed maturation--that are a prerequisite for infectivity. Despite its fundamental importance for viral replication, little is currently known about the regulation of proteolysis and about the dynamics and structural intermediates of maturation. This is due mainly to the fact that HIV-1 release and maturation occur asynchronously both at the level of individual cells and at the level of particle release from a single cell. Here, we report a method to synchronize HIV-1 proteolysis in vitro based on protease inhibitor (PI) washout from purified immature virions, thereby temporally uncoupling virus assembly and maturation. Drug washout resulted in the induction of proteolysis with cleavage efficiencies correlating with the off-rate of the respective PR-PI complex. Proteolysis of Gag was nearly complete and yielded the correct products with an optimal half-life (t(1/2)) of ~5 h, but viral infectivity was not recovered. Failure to gain infectivity following PI washout may be explained by the observed formation of aberrant viral capsids and/or by pronounced defects in processing of the reverse transcriptase (RT) heterodimer associated with a lack of RT activity. Based on our results, we hypothesize that both the polyprotein processing dynamics and the tight temporal coupling of immature particle assembly and PR activation are essential for correct polyprotein processing and morphological maturation and thus for HIV-1 infectivity. Cleavage of the Gag and Gag-Pol HIV-1 polyproteins into their functional subunits by the viral protease activates the viral enzymes and causes major structural rearrangements essential for HIV-1 infectivity. 
This proteolytic maturation occurs concomitant with virus release, and investigation of its dynamics is hampered by the fact that virus populations in tissue culture contain particles at all stages of assembly and maturation. Here, we developed an inhibitor washout strategy to synchronize activation of protease in wild-type virus. We demonstrated that nearly complete Gag processing and resolution of the immature virus architecture are accomplished under optimized conditions. Nevertheless, most of the resulting particles displayed irregular morphologies, Gag-Pol processing was not faithfully reconstituted, and infectivity was not recovered. These data show that HIV-1 maturation is sensitive to the dynamics of processing and also that a tight temporal link between virus assembly and PR activation is required for correct polyprotein processing.
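The reported optimal Gag-processing half-life of ~5 h can be read quantitatively: if processing after PI washout followed simple first-order kinetics (an assumption of this sketch, not a claim from the paper), the unprocessed fraction would decay as below:

```python
# Hedged illustration: first-order decay of unprocessed Gag after
# protease-inhibitor washout, assuming the ~5 h half-life reported above.
# The first-order model is an illustrative assumption, not the authors'.

def fraction_unprocessed(t_hours, t_half=5.0):
    """Fraction of Gag still unprocessed t_hours after PI washout,
    under a simple exponential (half-life) model."""
    return 0.5 ** (t_hours / t_half)

print(fraction_unprocessed(10.0))  # 0.25 after two half-lives
```

Under this model, processing would be over 95% complete only after roughly a day, consistent with the abstract's description of maturation as a slow, dynamics-sensitive process.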
Mammalian enamel maturation: Crystallographic changes prior to tooth eruption
Kallistová, Anna; Horáček, Ivan; Šlouf, Miroslav; Skála, Roman; Fridrichová, Michaela
2017-01-01
Using the distal molar of a minipig as a model, we studied changes in the microstructural characteristics of apatite crystallites during enamel maturation (16-23 months of postnatal age), and their effects upon the mechanical properties of the enamel coat. The slow rate of tooth development in a pig model enabled us to reveal essential heterochronies in particular components of the maturation process. The maturation changes began along the enamel-dentine junction (EDJ) of the trigonid, spreading subsequently to the outer layers of the enamel coat to appear at the surface zone with a 2-month delay. Correspondingly, at the distal part of the tooth the timing of maturation processes is delayed by 3-5 months compared to the mesial part of the tooth. The early stage of enamel maturation (16-20 months), when the enamel coat is composed almost exclusively of radial prismatic enamel, is characterized by a gradual increase in crystallite thickness (by a mean monthly increment of 3.8 nm), and an increase in the prism width and thickness of crystals composed of elementary crystallites. The late stage of maturation (the last two months prior to tooth eruption), marked by the rapid appearance of the interprismatic matrix (IPM) during which the crystals densely infill spaces between prisms, is characterized by an abrupt decrease in microstrain and abrupt changes in the micromechanical properties of the enamel: a rapid increase in its ability to resist long-term load and its considerable hardening. The results suggest that in terms of crystallization dynamics the processes characterizing the early and late stage of mammalian enamel maturation represent distinct entities. With regard to common features with enamel formation in the tribosphenic molar, we argue that the separation of these processes could be a common apomorphy of mammalian amelogenetic dynamics in general. PMID:28196135
Han, Soo-Jin; Marshall, Vickie; Barsov, Eugene; Quiñones, Octavio; Ray, Alex; Labo, Nazzarena; Trivett, Matthew; Ott, David; Renne, Rolf
2013-01-01
Kaposi's sarcoma-associated herpesvirus (KSHV) encodes 12 pre-microRNAs that can produce 25 KSHV mature microRNAs. We previously reported single-nucleotide polymorphisms (SNPs) in KSHV-encoded pre-microRNA and mature microRNA sequences from clinical samples (V. Marshall et al., J. Infect. Dis., 195:645–659, 2007). To determine whether microRNA SNPs affect pre-microRNA processing and, ultimately, mature microRNA expression levels, we performed a detailed comparative analysis of (i) mature microRNA expression levels, (ii) in vitro Drosha/Dicer processing, and (iii) RNA-induced silencing complex-dependent targeting of wild-type (wt) and variant microRNA genes. Expression of pairs of wt and variant pre-microRNAs from retroviral vectors and measurement of KSHV mature microRNA expression by real-time reverse transcription-PCR (RT-PCR) revealed differential expression levels that correlated with the presence of specific sequence polymorphisms. Measurement of KSHV mature microRNA expression in a panel of primary effusion lymphoma cell lines by real-time RT-PCR recapitulated some observed expression differences but suggested a more complex relationship between sequence differences and expression of mature microRNA. Furthermore, in vitro maturation assays demonstrated significant SNP-associated changes in Drosha/DGCR8 and/or Dicer processing. These data demonstrate that SNPs within KSHV-encoded pre-microRNAs are associated with differential microRNA expression levels. Given the multiple reports on the involvement of microRNAs in cancer, the biological significance of these phenotypic and genotypic variants merits further studies in patients with KSHV-associated malignancies. PMID:24006441
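The abstract does not state the exact real-time RT-PCR analysis used; for background, differential expression between a variant and wild-type microRNA is commonly quantified with the standard 2^-ddCt fold-change calculation, sketched here (shown as general background, not as the authors' confirmed method):

```python
# The standard 2^-ddCt relative-expression calculation often used to
# compare real-time RT-PCR measurements between two samples. Shown as a
# generic illustration; the study's actual analysis is not specified.

def fold_change(ct_target_test, ct_ref_test, ct_target_ctrl, ct_ref_ctrl):
    """Relative expression of a target (e.g. a variant miRNA) versus a
    control sample (e.g. wild type), each normalized to a reference
    transcript's cycle threshold (Ct). Lower Ct means more template."""
    d_ct_test = ct_target_test - ct_ref_test
    d_ct_ctrl = ct_target_ctrl - ct_ref_ctrl
    return 2.0 ** -(d_ct_test - d_ct_ctrl)

print(fold_change(24.0, 18.0, 26.0, 18.0))  # 4.0: variant 4x higher
```

Fold changes computed this way are what would reveal the SNP-associated expression differences the abstract describes.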
Product-oriented Software Certification Process for Software Synthesis
NASA Technical Reports Server (NTRS)
Nelson, Stacy; Fischer, Bernd; Denney, Ewen; Schumann, Johann; Richardson, Julian; Oh, Phil
2004-01-01
The purpose of this document is to propose a product-oriented software certification process to facilitate use of software synthesis and formal methods. Why is such a process needed? Currently, software is tested until deemed bug-free rather than proving that certain software properties exist. This approach has worked well in most cases, but unfortunately, deaths still occur due to software failure. Using formal methods (techniques from logic and discrete mathematics like set theory, automata theory and formal logic as opposed to continuous mathematics like calculus) and software synthesis, it is possible to reduce this risk by proving certain software properties. Additionally, software synthesis makes it possible to automate some phases of the traditional software development life cycle resulting in a more streamlined and accurate development process.
Foundations for Security Aware Software Development Education
2005-11-22
depending on the budget, that support robustness. We discuss the educational customer base, projected lifetime, and complexity of paradigm shift that should...
Malina, Robert M
2014-06-01
Growth, maturation, and development dominate the daily lives of children and adolescents for approximately the first 2 decades of life. Growth and maturation are biological processes, while development is largely a behavioral process. The 3 processes occur simultaneously and interact. They can be influenced by physical activity and also can influence activity, performance, and fitness. Allowing for these potential interactions, 10 questions on growth and maturation that have relevance to physical activity, performance, and fitness are presented. The questions are not mutually exclusive and address several broadly defined topical areas: exercise and growth, body weight status (body mass index, adiposity rebound, "unhealthy weight gain"), movement proficiency (hypothesized barrier, role in obesity), individual differences, tracking, maturity-associated variation in performance, and corresponding variation in physical activity. Central to the discussion of each is the need for a biocultural approach recognizing the interactions of biology and behavior as potential influences on the variables of interest.
NASA Technical Reports Server (NTRS)
King, Ellis; Hart, Jeremy; Odegard, Ryan
2010-01-01
The Orion Crew Exploration Vehicle (CEV) is being designed to include significantly more automation capability than either the Space Shuttle or the International Space Station (ISS). In particular, the vehicle flight software has requirements to accommodate increasingly automated missions throughout all phases of flight. A data-driven flight software architecture will provide an evolvable automation capability to sequence through Guidance, Navigation & Control (GN&C) flight software modes and configurations while maintaining the required flexibility and human control over the automation. This flexibility is a key aspect needed to address the maturation of operational concepts, to permit ground and crew operators to gain trust in the system, and to mitigate unpredictability in human spaceflight. To allow for mission flexibility and reconfigurability, a data-driven approach is being taken to load the mission event plan as well as the flight software artifacts associated with the GN&C subsystem. A database of GN&C-level sequencing data is presented which manages and tracks the mission-specific and algorithm parameters to provide a capability to schedule GN&C events within mission segments. The flight software data schema for performing automated mission sequencing is presented with a concept of operations for interactions with ground and onboard crew members. A prototype architecture for fault identification, isolation, and recovery interactions with the automation software is presented and discussed as a forward work item.
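The essence of a data-driven sequencer like the one described is that the mission plan lives in data, not in code, and an operator can veto individual steps. A minimal sketch of that idea (segment names, event names, and the class are invented for illustration, not Orion's actual schema):

```python
# Illustrative sketch of a table-driven mission-event sequencer with a
# human-in-the-loop veto, in the spirit of the architecture above.
# All segment/event names are invented examples.

MISSION_PLAN = {  # mission segment -> ordered GN&C events (data, not code)
    "ascent": ["prelaunch_config", "first_stage_guidance", "staging"],
    "orbit":  ["orbit_insertion", "attitude_hold"],
}

class Sequencer:
    def __init__(self, plan):
        self.plan = plan
        self.completed = []

    def run_segment(self, segment, authorize=lambda event: True):
        """Step through a segment's events in order, letting an operator
        callback veto each step (human control over the automation)."""
        for event in self.plan[segment]:
            if authorize(event):
                self.completed.append(event)
        return self.completed

seq = Sequencer(MISSION_PLAN)
print(seq.run_segment("ascent"))
```

Because the plan is a table, reconfiguring the mission means editing data rather than recompiling flight software, which is the flexibility the abstract emphasizes.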
Indonesian kalkulator of oocytes (IKO): A smart application to determine our biological age
NASA Astrophysics Data System (ADS)
Wiweko, Budi; Narasati, Shabrina; Agung, Prince Gusti; Zesario, Aulia; Wibawa, Yohanes Satrya; Maidarti, Mila; Harzif, Achmad Kemal; Pratama, Gita; Sumapraja, Kanadi; Muharam, Raden; Hestiantoro, Andon
2018-02-01
Background: The use of smartphones and their associated applications provides new opportunities for physicians. At present, few applications have been designed in the field of infertility and Assisted Reproductive Technologies (ART). A study conducted on 1616 subjects proved that AMH (Anti-Mullerian Hormone) could be used to predict a woman's biological age earlier than Follicle-Stimulating Hormone (FSH) and Antral Follicle Count (AFC). In this study, we describe the AMH nomogram that has been developed into a mobile application, "Indonesian Kalculator of Oocytes" (IKO). The software required to create the IKO application was Android 4.0.3 Ice Cream Sandwich and Java Application Development. The hardware specification needed to develop the IKO app was a 4.0-inch screen, 512 MB RAM (random-access memory), and a CPU (central processing unit) with dual-core 1.2 GHz. The application is built using the Android SDK (Software Development Kit) and Java Application Development. With this application, we can predict a woman's biological age, the number of mature oocytes, and AMH level. This app is expected to help patients plan effectively for pregnancy and help the doctor choose the best intervention for patients who face infertility problems using Assisted Reproductive Technology (ART). The IKO application can be downloaded for free on Google PlayStore and Apple Store.
Adapting Ground Penetrating Radar for Non-Destructive In-Situ Root and Tuber Assessment
NASA Astrophysics Data System (ADS)
Teare, B. L.; Hays, D. B.; Delgado, A.; Dobreva, I. D.; Bishop, M. P.; Lacey, R.; Huo, D.; Wang, X.
2017-12-01
Ground penetrating radar (GPR) is a rapidly evolving technology extensively used in geoscience, civil engineering, archeology, and the military, and has recently found novel application in agricultural systems. One promising application of GPR is root and tuber detection and measurement. Current commercial GPR systems have been used for detection of large roots, but few studies have attempted to detect agronomic roots, and even fewer have attempted to measure and quantify total root mass. The ability to monitor and measure root and tuber mass and architecture in an agricultural setting would have far-reaching effects. A few of these include the potential for breeding higher-yielding root and tuber crops, rapid-bulking roots, discovery of crops with greater carbon sequestration, discovery of plant varieties with a greater ability to stabilize slopes against erosion and slope failure, and drought-tolerant varieties. Despite the possible benefits and the current maturity of GPR technology, several challenges remain in optimizing its use for root and tuber detection. These challenges center on three categories: spatial resolution, data processing, and field-deployable hardware configuration. This study centers on tuber measurement, and its objectives are to i) identify ideal antenna array configurations, frequency, and pulse density; ii) develop novel processing techniques which leverage powerful computer technologies to provide highly accurate measurements of detected features; and iii) develop a cart system which is appropriate for agricultural fields and non-destructive sampling. Already, a 2 GHz multiarray antenna has been identified as an optimal system for tuber detection. Software and processing algorithm development is ongoing, but has already shown improvement over current software offerings. Recent field activity suggests that carts should be width-adjustable and sport independent suspension systems to maintain antenna orientation.
Distilling the Verification Process for Prognostics Algorithms
NASA Technical Reports Server (NTRS)
Roychoudhury, Indranil; Saxena, Abhinav; Celaya, Jose R.; Goebel, Kai
2013-01-01
The goal of prognostics and health management (PHM) systems is to ensure system safety and reduce downtime and maintenance costs. It is important that a PHM system be verified and validated before it can be successfully deployed. Prognostics algorithms are integral parts of PHM systems. This paper investigates a systematic process for the verification of such prognostics algorithms. To this end, the paper first distinguishes between technology maturation and product development. It then describes the verification process for a prognostics algorithm as it moves up to higher maturity levels. This process is shown to be iterative, with verification activities interleaved with validation activities at each maturation level. In this work, we adopt the concept of technology readiness levels (TRLs) to represent the different maturity levels of a prognostics algorithm. It is shown that at each TRL, the verification of a prognostics algorithm depends on verifying the different components of the algorithm according to the requirements laid out by the PHM system that adopts it. Finally, using simplified examples, the systematic process for verifying a prognostics algorithm is demonstrated as the algorithm moves up TRLs.
System Maturity and Architecture Assessment Methods, Processes, and Tools
2012-03-02
Ramirez-Marquez, D. Nowicki, A. Deshmukh, and M. Sarfaraz. Development of Systems Engineering Maturity Models and Management Tools. Systems Engineering Research Center Final Technical Report.
New definitions for cotton fiber maturity ratio
USDA-ARS?s Scientific Manuscript database
Cotton fiber maturity affects fiber physical, mechanical, and chemical properties, as well as the processability and qualities of yarn and fabrics. New definitions of cotton fiber maturity ratio are introduced. The influences of sampling, sample preparation, measurement method, and correlations am...
Mansouri, Mohammadreza; Ramezani, Farshid; Moghimi, Sasan; Tabatabaie, Ali; Abdi, Fatemeh; He, Mingguang; Lin, Shan C
2014-10-21
To describe anterior segment optical coherence tomography (AS-OCT) parameters in phacomorphic angle closure eyes, mature cataract eyes, and their fellow eyes, and to identify parameters that could be used to differentiate phacomorphic angle closure eyes from those with mature cataract and no phacomorphic angle closure. In this cross-sectional study, a total of 33 phacomorphic angle closure subjects and 34 control patients with unilateral mature cataracts were enrolled. All patients underwent AS-OCT imaging and A-scan biometry of both eyes. Anterior chamber depth (ACD), anterior chamber area (ACA), iris thickness, iris curvature, lens vault (LV), and angle parameters, including angle opening distance (AOD750) and trabecular-iris space area (TISA750), were measured in qualified images using customized software and compared among eyes with phacomorphic angle closure, mature cataract eyes, and their fellow eyes. There was no significant difference in axial length among the four groups. Phacomorphic angle closure eyes had the smallest angle (AOD750, TISA750) and anterior chamber parameters (ACD, ACA, anterior chamber width) and the greatest LV among the groups. This pattern was similar when comparing fellow eyes of mature cataract patients and fellow eyes of phacomorphic angle closure patients. Anterior chamber area less than 18.62 mm², ACD less than 2.60 mm, LV greater than 532.0 μm, and AOD750 less than 0.218 mm had the highest odds ratios (ORs) for distinguishing fellow eyes of phacomorphic angle closure from fellow eyes of mature cataracts, with OR values of 9.90, 8.31, 7.91, and 7.91, respectively. Logistic regression showed that ACA less than 18.62 mm² was the major parameter associated with fellow eyes of phacomorphic angle closure (OR = 10.96, P < 0.001). Anterior chamber depth, ACA, AOD750, and LV are powerful indicators for differentiating phacomorphic angle closure eyes from those with mature cataract and their fellow eyes.
Copyright 2014 The Association for Research in Vision and Ophthalmology, Inc.
The Roles of Family B and D DNA Polymerases in Thermococcus Species 9°N Okazaki Fragment Maturation*
Greenough, Lucia; Kelman, Zvi; Gardner, Andrew F.
2015-01-01
During replication, Okazaki fragment maturation is a fundamental process that joins discontinuously synthesized DNA fragments into a contiguous lagging strand. Efficient maturation prevents repeat sequence expansions, small duplications, and generation of double-stranded DNA breaks. To address the components required for the process in Thermococcus, Okazaki fragment maturation was reconstituted in vitro using purified proteins from Thermococcus species 9°N or cell extracts. A dual color fluorescence assay was developed to monitor reaction substrates, intermediates, and products. DNA polymerase D (polD) was proposed to function as the replicative polymerase in Thermococcus replicating both the leading and the lagging strands. It is shown here, however, that it stops before the previous Okazaki fragments, failing to rapidly process them. Instead, Family B DNA polymerase (polB) was observed to rapidly fill the gaps left by polD and displaces the downstream Okazaki fragment to create a flap structure. This flap structure was cleaved by flap endonuclease 1 (Fen1) and the resultant nick was ligated by DNA ligase to form a mature lagging strand. The similarities to both bacterial and eukaryotic systems and evolutionary implications of archaeal Okazaki fragment maturation are discussed. PMID:25814667
Hequet, O; Le, Q H; Rodriguez, J; Dubost, P; Revesz, D; Clerc, A; Rigal, D; Salles, G; Coiffier, B
2014-04-01
Hematopoietic stem cells (HSCs) required to perform autologous peripheral blood stem cell transplantation (APBSCT) can be collected by processing several blood volumes (BVs) in leukapheresis sessions. However, this may cause granulocytes to be harvested in the graft and decrease the patient's blood platelet level; both consequences may harm the patient. One current goal of apheresis teams is to improve HSC collection by increasing HSC yield while preventing an increase in granulocyte and platelet harvests. Before improving HSC collection, it seemed important to know more about how these types of cells are harvested. The purpose of our study was to develop a simple model for analysing the collection of intended CD34+ cells among HSCs (designated here as HSC) and the harvest of unintended platelets or granulocytes among mature cells (designated here as mature cells), considering the number of BVs processed and the factors likely to influence cell collection or harvest. For this, we processed 1, 2 and 3 BVs in 59 leukapheresis sessions and analysed the corresponding collections and harvests with a referent device (COBE Spectra). First, we analysed the amounts of HSC collected and mature cells harvested, and second, the evolution of the respective shares of HSC and mature cells collected or harvested throughout the BV processes. HSC collections and mature cell harvests increased globally (p<0.0001) and their respective shares remained stable throughout the BV processes (p non-significant). We analysed the role of intrinsic (patient's features) and extrinsic (features before starting leukapheresis sessions) factors in collections and harvests, which showed that only pre-leukapheresis blood levels (CD34+ cells and platelets) influenced both cell collections and harvests (p<0.001) and the shares of HSC collected and unintended mature cells harvested (p<0.001) throughout the BV processes.
Altogether, our results suggested that the main factors likely to influence intended HSC collections or unintended mature cell harvests were pre-leukapheresis blood cell levels. Our model was meant to assist apheresis teams in analysing shares of HSC collected and mature cells harvested with new devices or with new types of HSC mobilization. Copyright © 2014 Elsevier Ltd. All rights reserved.
Proteomic analysis of 'Zaosu' pear (Pyrus bretschneideri Rehd.) and its early-maturing bud sport.
Liu, Xueting; Zhai, Rui; Feng, Wenting; Zhang, Shiwei; Wang, Zhigang; Qiu, Zonghao; Zhang, Junke; Ma, Fengwang; Xu, Lingfei
2014-07-01
Maturation of fruits involves a series of physiological, biochemical, and organoleptic changes that eventually make fleshy fruits attractive, palatable, and nutritious. In order to understand the maturation mechanism of the early-maturing bud sport of 'Zaosu' pear, we analyzed differences in proteome expression between the two pears at different maturation stages using a combination of two-dimensional electrophoresis (2-DE) and matrix-assisted laser desorption ionization-time of flight mass spectrometry (MALDI-TOF MS). Seventy-five differentially expressed protein spots (p<0.05) were obtained between 'Zaosu' pear and its early-maturing bud sport, but only sixty-eight were conclusively identified in the NCBI and UniProt databases. The majority of the proteins were linked to metabolism, energy, stress response/defense, and cell structure. Additionally, our data confirmed an increase in proteins related to cell-wall modification, oxidative stress, and pentose phosphate metabolism, and a decrease in proteins related to photosynthesis and glycolysis during the development of both pears, with all these proteins increasing or decreasing faster in the early-maturing bud sport. This comparative analysis showed that these proteins are closely associated with maturation and can provide a more detailed characterization of the maturation process of both pears. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Structural Maturation of HIV-1 Reverse Transcriptase—A Metamorphic Solution to Genomic Instability
London, Robert E.
2016-01-01
Human immunodeficiency virus 1 (HIV-1) reverse transcriptase (RT)—a critical enzyme of the viral life cycle—undergoes a complex maturation process, required so that a pair of p66 precursor proteins can develop conformationally along different pathways, one evolving to form active polymerase and ribonuclease H (RH) domains, while the second forms a non-functional polymerase and a proteolyzed RH domain. These parallel maturation pathways rely on the structural ambiguity of a metamorphic polymerase domain, for which the sequence–structure relationship is not unique. Recent nuclear magnetic resonance (NMR) studies utilizing selective labeling techniques, and structural characterization of the p66 monomer precursor have provided important insights into the details of this maturation pathway, revealing many aspects of the three major steps involved: (1) domain rearrangement; (2) dimerization; and (3) subunit-selective RH domain proteolysis. This review summarizes the major structural changes that occur during the maturation process. We also highlight how mutations, often viewed within the context of the mature RT heterodimer, can exert a major influence on maturation and dimerization. It is further suggested that several steps in the RT maturation pathway may provide attractive targets for drug development. PMID:27690082
TBC-8, a Putative RAB-2 GAP, Regulates Dense Core Vesicle Maturation in Caenorhabditis elegans
Hannemann, Mandy; Sasidharan, Nikhil; Hegermann, Jan; Kutscher, Lena M.; Koenig, Sabine; Eimer, Stefan
2012-01-01
Dense core vesicles (DCVs) are thought to be generated at the late Golgi apparatus as immature DCVs, which subsequently undergo a maturation process through clathrin-mediated membrane remodeling events. This maturation process is required for efficient processing of neuropeptides within DCVs and for removal of factors that would otherwise interfere with DCV release. Previously, we have shown that the GTPase, RAB-2, and its effector, RIC-19, are involved in DCV maturation in Caenorhabditis elegans motoneurons. In rab-2 mutants, specific cargo is lost from maturing DCVs and missorted into the endosomal/lysosomal degradation route. Cargo loss could be prevented by blocking endosomal delivery. This suggests that RAB-2 is involved in retention of DCV components during the sorting process at the Golgi-endosomal interface. To understand how RAB-2 activity is regulated at the Golgi, we screened for RAB-2–specific GTPase activating proteins (GAPs). We identified a potential RAB-2 GAP, TBC-8, which is exclusively expressed in neurons and which, when depleted, shows similar DCV maturation defects as rab-2 mutants. We could demonstrate that RAB-2 binds to its putative GAP, TBC-8. Interestingly, TBC-8 also binds to the RAB-2 effector, RIC-19. This interaction appears to be conserved as TBC-8 also interacted with the human ortholog of RIC-19, ICA69. Therefore, we propose that a dynamic ON/OFF cycling of RAB-2 at the Golgi induced by the GAP/effector complex is required for proper DCV maturation. PMID:22654674
WISE: Automated support for software project management and measurement. M.S. Thesis
NASA Technical Reports Server (NTRS)
Ramakrishnan, Sudhakar
1995-01-01
One important aspect of software development and IV&V is measurement. Unless a software development effort is measured in some way, it is difficult to judge the effectiveness of current efforts and predict future performance. Collecting metrics and adhering to a process are difficult tasks in a software project. Change activity is a powerful indicator of project status. Automated systems that can handle change requests, issues, and other process documents provide an excellent platform for tracking the status of the project. A World Wide Web based architecture is developed for (a) making metrics collection an implicit part of the software process, (b) providing metric analysis dynamically, (c) supporting automated tools that can complement current practices of in-process improvement, and (d) overcoming geographical barriers. An operational system (WISE) instantiates this architecture, allowing for the improvement of the software process in a realistic environment. The tool tracks issues in the software development process, provides informal communication between users with different roles, supports to-do lists (TDL), and helps in software process improvement. WISE minimizes the time devoted to metrics collection and analysis, and captures software change data. Automated tools like WISE focus on understanding and managing the software process. The goal is improvement through measurement.
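The "implicit metrics collection" idea in (a) can be illustrated with a toy tracker in which every submitted change request is itself the metric record, so no separate collection step is needed. The class and field names below are invented for illustration and are not WISE's actual design:

```python
from collections import Counter
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

@dataclass
class ChangeRequest:
    # Hypothetical record; the abstract does not describe WISE's schema.
    component: str
    opened: date
    closed: Optional[date] = None

@dataclass
class Tracker:
    requests: list = field(default_factory=list)

    def submit(self, cr: ChangeRequest) -> None:
        # Metric capture is implicit: submitting the request *is* the data point.
        self.requests.append(cr)

    def open_count(self) -> int:
        return sum(1 for cr in self.requests if cr.closed is None)

    def churn_by_component(self) -> Counter:
        # Change activity per component, a simple project-status indicator.
        return Counter(cr.component for cr in self.requests)

tracker = Tracker()
tracker.submit(ChangeRequest("parser", date(1995, 3, 1)))
tracker.submit(ChangeRequest("parser", date(1995, 3, 5), date(1995, 3, 9)))
tracker.submit(ChangeRequest("ui", date(1995, 4, 2)))
print(tracker.open_count())                    # 2
print(tracker.churn_by_component()["parser"])  # 2
```

Because the metrics are derived on demand from the request log, analysis stays dynamic, which is the property point (b) asks for.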
Lipid raft dynamics linked to sperm competency for fertilization in mice.
Watanabe, Hitomi; Takeda, Rie; Hirota, Keiji; Kondoh, Gen
2017-05-01
It is well known that mammalian sperm acquire fertilization ability after several maturation processes, particularly within the female reproductive tract. In a previous study, we found that both glycosylphosphatidylinositol (GPI)-anchored protein (GPI-AP) release and lipid raft movement occur during the sperm maturation process. Several genetic studies have shown that release of GPI-APs is a crucial step for sperm fertilization ability in the mouse. Here, we show that lipid raft movement is also fundamental for sperm to become competent for fertilization, by comparing the sperm maturation process of two inbred mouse strains, C57BL/6 and BALB/c. We found that ganglioside GM1 movement was exclusively reduced in BALB/c compared with C57BL/6 among the examined sperm maturation parameters, such as GPI-AP release, sperm migration to the oviduct, cholesterol efflux, protein tyrosine phosphorylation and acrosome reaction, and was strongly linked to the sperm fertility phenotype. The relationship between GM1 movement and in vitro fertilization ability was confirmed in other mouse strains, suggesting that lipid raft movement is one of the important steps in completing the sperm maturation process. © 2017 Molecular Biology Society of Japan and John Wiley & Sons Australia, Ltd.
David, Cristiana; Peride, Ileana; Niculae, Andrei; Constantin, Alexandra Maria; Checherita, Ionel Alexandru
2016-09-20
Native arteriovenous fistula (AVF) is the most appropriate type of vascular access for chronic dialysis. Its patency rates depend on vascular wall characteristics. Ketoacid analogues of essential amino acids (KA/EAA) are prescribed to end-stage renal disease (ESRD) pre-dialysis patients to lower the generation of toxic metabolic products and improve nutritional status. We hypothesized that a very-low protein diet (VLPD) supplemented with KA/EAA may influence arterial wall stiffness and affect AVF maturation rates and duration in pre-dialysis ESRD patients. In a prospective, cohort, 3-year study we enrolled 67 consecutive non-diabetic early-referral ESRD patients who underwent AVF creation in our hospital. Patients were divided into two groups based on their regimen during the 12 months prior to surgery: a study group on a VLPD supplemented with KA/EAA versus a control group on a low protein diet without KA/EAA supplementation. For each patient we performed serum analysis for parameters of bone mineral disease, inflammation, and nutritional status, one pulse wave velocity (PWV) measurement and one Doppler ultrasound (US) determination prior to surgery, followed by Doppler US assessments at 4, 6, 8 and 12 weeks after it. Rates and duration of mature AVF achievement were noted. We used logistic regression to analyze the association between AVF maturation and KA/EAA administration, comparing rates and durations between groups, unadjusted and adjusted for systolic blood pressure, C-reactive protein, PWV, and phosphorus values. All parameters in the logistic model were transformed into binary variables. A p-value < α = 0.05 was considered significant; data were processed using SPSS 16 software and Excel. In the study group (n = 28, aged 57 ± 12.35, 13 females) we registered better serum phosphate (p = 0.022) and C-reactive protein control (p = 0.021), lower PWV (p = 0.007) and a higher rate of successful AVF creation (33.3 % versus 17.8 %, p < 0.05).
AVF maturation duration was shorter in the study group (5.91 versus 7.15 weeks, p < 0.001). VLPD supplemented with KA/EAA appears to improve the native AVF primary outcome, decreasing initial vascular stiffness, possibly by preserving vascular wall quality in CKD patients through better serum phosphate control and limitation of the inflammatory response.
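With all predictors transformed into binary variables, as in the study above, a logistic-regression coefficient β for a predictor corresponds to an odds ratio OR = e^β, and for a simple 2x2 exposure/outcome table the OR can be computed directly. A minimal numeric check, using invented counts rather than the study's data:

```python
from math import exp, log

def odds_ratio(exposed_event, exposed_no, unexposed_event, unexposed_no):
    # Classic 2x2-table odds ratio; equals exp(beta) for the
    # corresponding binary predictor in a univariate logistic model.
    return (exposed_event * unexposed_no) / (exposed_no * unexposed_event)

# Invented counts for illustration only (not data from the study above):
or_ = odds_ratio(20, 8, 9, 30)
print(round(or_, 2))   # 8.33
beta = log(or_)        # the logistic-regression coefficient for this predictor
print(round(exp(beta), 2))  # 8.33
```

This is why reporting ORs with cut-offs (e.g. "ACA less than 18.62") is equivalent to reporting exponentiated coefficients of the binarized predictors.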
Heterogeneous computing architecture for fast detection of SNP-SNP interactions.
Sluga, Davor; Curk, Tomaz; Zupan, Blaz; Lotric, Uros
2014-06-25
The extent of data in a typical genome-wide association study (GWAS) poses considerable computational challenges to software tools for gene-gene interaction discovery. Exhaustive evaluation of all interactions among hundreds of thousands to millions of single nucleotide polymorphisms (SNPs) may require weeks or even months of computation. Massively parallel hardware within a modern Graphic Processing Unit (GPU) and Many Integrated Core (MIC) coprocessors can shorten the run time considerably. While the utility of GPU-based implementations in bioinformatics has been well studied, MIC architecture has been introduced only recently and may provide a number of comparative advantages that have yet to be explored and tested. We have developed a heterogeneous, GPU and Intel MIC-accelerated software module for SNP-SNP interaction discovery to replace the previously single-threaded computational core in the interactive web-based data exploration program SNPsyn. We report on differences between these two modern massively parallel architectures and their software environments. Their use resulted in an order of magnitude shorter execution times compared to the single-threaded CPU implementation. The GPU implementation on a single Nvidia Tesla K20 runs twice as fast as that for the MIC architecture-based Xeon Phi P5110 coprocessor, but also requires considerably more programming effort. General purpose GPUs are a mature platform with large amounts of computing power capable of tackling inherently parallel problems, but can prove demanding for the programmer. On the other hand, the new MIC architecture, albeit lacking in performance, reduces the programming effort and makes up for it with a more general architecture suitable for a wider range of problems.
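The computational burden described above comes from the exhaustive pairwise structure of the scan. A deliberately simplified serial sketch (with a toy scoring function, not SNPsyn's actual information-gain measure) shows the O(n²) loop over SNP pairs that the GPU and MIC implementations parallelize:

```python
from itertools import combinations

def pair_score(a, b, phenotype):
    # Toy score: fraction of samples where the genotype pair (XORed here
    # purely for illustration) agrees with the binary phenotype.
    return sum(1 for x, y, p in zip(a, b, phenotype) if (x ^ y) == p) / len(phenotype)

def exhaustive_scan(snps, phenotype):
    # O(n^2) pairs: for ~10^6 SNPs this is ~5*10^11 evaluations, which is
    # why single-threaded runs take weeks and parallel hardware pays off.
    return max(
        ((i, j, pair_score(snps[i], snps[j], phenotype))
         for i, j in combinations(range(len(snps)), 2)),
        key=lambda t: t[2],
    )

snps = [[0, 1, 0, 1], [1, 1, 0, 0], [0, 0, 1, 1]]  # toy binarized genotypes
phenotype = [1, 0, 0, 1]
best = exhaustive_scan(snps, phenotype)
print(best)  # (0, 1, 1.0)
```

Each pair is scored independently of every other pair, so the workload maps naturally onto thousands of GPU threads or MIC cores.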
Design Criteria For Networked Image Analysis System
NASA Astrophysics Data System (ADS)
Reader, Cliff; Nitteberg, Alan
1982-01-01
Image systems design is currently undergoing a metamorphosis from the conventional computing systems of the past into a new generation of special purpose designs. This change is motivated by several factors, notable among which is the increased opportunity for high performance at low cost offered by advances in semiconductor technology. Another key issue is a maturing understanding of the problems and the applicability of digital processing techniques. These factors allow the design of cost-effective systems that are functionally dedicated to specific applications and used in a utilitarian fashion. Following an overview of the above issues, the paper presents a top-down approach to the design of networked image analysis systems. The requirements for such a system are presented, with orientation toward the hospital environment. The three main areas are image data base management, viewing of image data, and image data processing. This is followed by a survey of the current state of the art, covering image display systems, data base techniques, communications networks, and software systems control. The paper concludes with a description of the functional subsystems and architectural framework for networked image analysis in a production environment.
A case study of exploiting enterprise resource planning requirements
NASA Astrophysics Data System (ADS)
Niu, Nan; Jin, Mingzhou; Cheng, Jing-Ru C.
2011-05-01
The requirements engineering (RE) processes have become a key to conceptualising corporate-wide integrated solutions based on packaged enterprise resource planning (ERP) software. The RE literature has mainly focused on procuring the most suitable ERP package. Little is known about how an organisation exploits the chosen ERP RE model to frame the business application development. This article reports an exploratory case study of a key tenet of ERP RE adoption, namely that aligning business applications to the packaged RE model leads to integral practices and economic development. The case study analysed a series of interrelated pilot projects developed for a business division of a large IT manufacturing and service company, using Oracle's application implementation method (AIM). The study indicated that AIM RE improved team collaboration and the project management experience, but needed to make hidden assumptions explicit to support data visibility and integrity. Our study can direct researchers towards rigorous empirical evaluations of ERP RE adoption, collect experiences and lessons learned for practitioners, and help generate more effective and mature processes when exploiting ERP RE methods.
Potential Collaborative Research topics with Korea’s Agency for Defense Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farrar, Charles R.; Todd, Michael D.
2012-08-23
This presentation provides a high-level summary of current research activities at the Los Alamos National Laboratory (LANL)-University of California San Diego (UCSD) Jacobs School of Engineering Institute that will be presented at Korea's Agency for Defense Development (ADD). These research activities are at the basic engineering science level, with levels of maturity ranging from initial concepts to field proof-of-concept demonstrations. We believe that all of these activities are appropriate for collaborative research with ADD, subject to approval by each institution. All the activities summarized herein share the common theme that they are multi-disciplinary in nature and typically involve the integration of high-fidelity predictive modeling, advanced sensing technologies, and new developments in information technology. These activities include: Wireless Sensor Systems, Swarming Robot Sensor Systems, Advanced Signal Processing (compressed sensing) and Pattern Recognition, Model Verification and Validation, Optimal/Robust Sensor System Design, Haptic Systems for Large-Scale Data Processing, Cyber-Physical Security for Robots, Multi-Source Energy Harvesting, Reliability-Based Approaches to Damage Prognosis, SHMTools Software Development, and a Cyber-Physical Systems Advanced Study Institute.
MicROS-drt: supporting real-time and scalable data distribution in distributed robotic systems.
Ding, Bo; Wang, Huaimin; Fan, Zedong; Zhang, Pengfei; Liu, Hui
A primary requirement in distributed robotic software systems is the dissemination of data to all interested collaborative entities in a timely and scalable manner. However, providing such a service in a highly dynamic and resource-limited robotic environment is a challenging task, and existing robot software infrastructure has limitations in this respect. This paper presents a novel robot software infrastructure, micROS-drt, which supports real-time and scalable data distribution. The solution is based on a loosely coupled data publish-subscribe model with the ability to support various time-related constraints. To realize this model, a mature data distribution standard, the Data Distribution Service for real-time systems (DDS), is adopted as the foundation of the transport layer of this software infrastructure. By elaborately adapting and encapsulating the capability of the underlying DDS middleware, micROS-drt can meet the requirement of real-time and scalable data distribution in distributed robotic systems. Evaluation results in terms of scalability, latency jitter and transport priority, as well as an experiment on real robots, validate the effectiveness of this work.
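The loosely coupled publish-subscribe model with time-related constraints can be sketched as follows. This toy in-process bus only mimics the deadline-style quality-of-service that micROS-drt actually obtains from DDS middleware; the class and topic names are invented for illustration:

```python
import time
from collections import defaultdict

class Bus:
    """Toy topic-based publish-subscribe bus. Publishers and subscribers
    never reference each other directly (loose coupling); a per-subscriber
    deadline imitates a DDS-like time-related constraint."""

    def __init__(self):
        self.subs = defaultdict(list)

    def subscribe(self, topic, callback, deadline_s=None):
        self.subs[topic].append((callback, deadline_s))

    def publish(self, topic, data, sent_at=None):
        sent_at = time.monotonic() if sent_at is None else sent_at
        for callback, deadline_s in self.subs[topic]:
            latency = time.monotonic() - sent_at
            if deadline_s is not None and latency > deadline_s:
                continue  # sample too old for this subscriber: drop it
            callback(data)

bus = Bus()
received = []
bus.subscribe("/laser_scan", received.append, deadline_s=0.1)
bus.publish("/laser_scan", [1.2, 1.3, 1.4])        # fresh: delivered
bus.publish("/laser_scan", [9.9],
            sent_at=time.monotonic() - 1.0)        # stale: dropped
print(received)  # [[1.2, 1.3, 1.4]]
```

In the real system this dispatch, along with transport priority and reliability settings, is delegated to the DDS layer rather than implemented in the robot framework itself.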
Building quality into medical product software design.
Mallory, S R
1993-01-01
The software engineering and quality assurance disciplines are a requisite to the design of safe and effective software-based medical devices. It is in the areas of software methodology and process that the most beneficial application of these disciplines to software development can be made. Software is a product of complex operations and methodologies and is not amenable to the traditional electromechanical quality assurance processes. Software quality must be built in by the developers, with the software verification and validation engineers acting as the independent instruments for ensuring compliance with performance objectives and with development and maintenance standards. The implementation of a software quality assurance program is a complex process involving management support, organizational changes, and new skill sets, but the benefits are profound. Its rewards provide safe, reliable, cost-effective, maintainable, and manageable software, which may significantly speed the regulatory review process and therefore potentially shorten the overall time to market. The use of a trial project can greatly facilitate the learning process associated with the first-time application of a software quality assurance program.
Molecular changes and signaling events occurring in sperm during epididymal maturation
Gervasi, Maria Gracia; Visconti, Pablo E.
2017-01-01
After leaving the testis, sperm have not yet acquired the ability to move progressively and are unable to fertilize oocytes. To become fertilization-competent, they must go through an epididymal maturation process in the male, and capacitation in the female tract. Epididymal maturation can be defined as those changes occurring to sperm in the epididymis that give the sperm the ability to capacitate in the female tract. As part of this process, sperm cells undergo a series of biochemical and physiological changes that require incorporation of new molecules derived from the epididymal epithelium, as well as post-translational modifications of endogenous proteins synthesized during spermiogenesis in the testis. This review focuses on epididymal maturation events, with emphasis on recent advances in the understanding of the molecular basis of this process. PMID:28297559
Toward the Decision Tree for Inferring Requirements Maturation Types
NASA Astrophysics Data System (ADS)
Nakatani, Takako; Kondo, Narihito; Shirogane, Junko; Kaiya, Haruhiko; Hori, Shozo; Katamine, Keiichi
Requirements are elicited step by step during the requirements engineering (RE) process. However, some types of requirements are only elicited completely after the scheduled requirements elicitation process is finished. Such a situation is regarded as problematic. In our study, the difficulty of eliciting various kinds of requirements is observed by component. We refer to these components as observation targets (OTs) and introduce the term “requirements maturation,” which denotes when and how requirements are elicited completely in the project. Requirements maturation is discussed for physical and logical OTs. OTs viewed from a logical viewpoint are called logical OTs, e.g. quality requirements. The requirements of physical OTs, e.g. modules, components, subsystems, etc., include functional and non-functional requirements. They are influenced by their requesters' environmental changes, as well as by developers' technical changes. In order to infer the requirements maturation period of each OT, we need to know how much these factors influence the OTs' requirements maturation. Based on observation of actual past projects, we defined the PRINCE (Pre Requirements Intelligence Net Consideration and Evaluation) model. It aims to guide developers in their observation of the requirements maturation of OTs. We quantitatively analyzed actual cases of the requirements elicitation process and extracted the essential factors that influence requirements maturation. The results of interviews with project managers were analyzed with WEKA, a data mining system, from which a decision tree was derived. This paper introduces the PRINCE model and the category of logical OTs to be observed. The decision tree that helps developers infer the maturation type of an OT is also described. We evaluate the tree on real projects and discuss its ability to infer requirements maturation types.
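The decision-tree derivation described above can be illustrated with a stdlib-only sketch of the information-gain criterion that data-mining tools such as WEKA use when choosing a split. The project factors and maturation labels below are invented for illustration and are not the study's actual interview data:

```python
from collections import Counter
from math import log2

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def info_gain(rows, labels, factor):
    # Entropy of the labels minus the weighted entropy after splitting
    # on one factor: the standard tree-induction split criterion.
    base = entropy(labels)
    groups = {}
    for row, label in zip(rows, labels):
        groups.setdefault(row[factor], []).append(label)
    remainder = sum(len(g) / len(labels) * entropy(g) for g in groups.values())
    return base - remainder

# Hypothetical projects: factor values and a "maturation type" label.
rows = [
    {"requester_env_stable": "yes", "tech_novel": "no"},
    {"requester_env_stable": "yes", "tech_novel": "yes"},
    {"requester_env_stable": "no",  "tech_novel": "no"},
    {"requester_env_stable": "no",  "tech_novel": "yes"},
]
labels = ["early", "early", "late", "late"]

best = max(rows[0], key=lambda f: info_gain(rows, labels, f))
print(best)  # requester_env_stable
```

Here the factor that perfectly separates early from late maturation wins the root split; a full induction algorithm would recurse on each resulting subset.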
Heinz, Andrea; Ruttkies, Christoph K H; Jahreis, Günther; Schräder, Christoph U; Wichapong, Kanin; Sippl, Wolfgang; Keeley, Fred W; Neubert, Reinhard H H; Schmelzer, Christian E H
2013-04-01
Elastin is a vital protein and the major component of elastic fibers, which provide resilience to many vertebrate tissues. Elastin's structure and function are influenced by extensive cross-linking; however, the cross-linking pattern is still unknown. Small peptides containing reactive allysine residues, based on sequences of cross-linking domains of human elastin, were incubated in vitro to form cross-links characteristic of mature elastin. The resultant insoluble polymeric biomaterials were studied by scanning electron microscopy. Both the supernatants of the samples and the insoluble polymers, after digestion with pancreatic elastase or trypsin, were furthermore comprehensively characterized at the molecular level using MALDI-TOF/TOF mass spectrometry. MS(2) data were used to develop the software PolyLinX, which is able to sequence not only linear and bifunctionally cross-linked peptides but, for the first time, also tri- and tetrafunctionally cross-linked species. Thus, it was possible to identify intra- and intermolecular cross-links including allysine aldols, dehydrolysinonorleucines and dehydromerodesmosines. The formation of the tetrafunctional cross-link desmosine or isodesmosine was unexpected but could be confirmed by tandem mass spectrometry and molecular dynamics simulations. The study demonstrated that it is possible to produce biopolymers containing polyfunctional cross-links characteristic of mature elastin from small elastin peptides. MALDI-TOF/TOF mass spectrometry and the newly developed software PolyLinX proved suitable for sequencing native cross-links in proteolytic digests of elastin-like biomaterials. The study provides important insight into the formation of native elastin cross-links and represents a considerable step towards the characterization of the complex cross-linking pattern of mature elastin. Copyright © 2013 Elsevier B.V. All rights reserved.
Certification Processes for Safety-Critical and Mission-Critical Aerospace Software
NASA Technical Reports Server (NTRS)
Nelson, Stacy
2003-01-01
This document is a quick reference guide with an overview of the processes required to certify safety-critical and mission-critical flight software at selected NASA centers and the FAA. Researchers and software developers can use this guide to jumpstart their understanding of how to get new or enhanced software onboard an aircraft or spacecraft. The introduction contains aerospace industry definitions of safety and safety-critical software, as well as the current rationale for certification of safety-critical software. The Standards for Safety-Critical Aerospace Software section lists and describes current standards, including NASA standards and RTCA DO-178B. The Mission-Critical versus Safety-Critical Software section explains the difference between two important classes of software: safety-critical software, involving the potential for loss of life due to software failure, and mission-critical software, involving the potential for aborting a mission due to software failure. The DO-178B Safety-Critical Certification Requirements section describes special processes and methods required to obtain a safety-critical certification for aerospace software flying on vehicles under the auspices of the FAA. The final two sections give an overview of the certification process used at Dryden Flight Research Center and the approval process at the Jet Propulsion Laboratory (JPL).
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-25
... modify its Rules as they relate to the Issuing/Paying Agent's (``IPA's'') refusal to pay process. DTC is proposing not to process a reversal of a transaction initiated by an IPA when issuances of Money Market... morning for MMIs maturing that day. The automatic process electronically sweeps all maturing positions of...
[Maturity Levels of Quality and Risk Management at the University Hospital Schleswig-Holstein].
Jussli-Melchers, Jill; Hilbert, Carsten; Jahnke, Iris; Wehkamp, Kai; Rogge, Annette; Freitag-Wolf, Sandra; Kahla-Witzsch, Heike A; Scholz, Jens; Petzina, Rainer
2018-05-16
Quality and risk management in hospitals are not only required by law but also for an optimal patient-centered and process-optimized patient care. To evaluate the maturity levels of quality and risk management at the University Hospital Schleswig-Holstein (UKSH), a structured analytical tool was developed for easy and efficient application. Four criteria concerning quality management - quality assurance (QS), critical incident reporting system (CIRS), complaint management (BM) and process management (PM) - were evaluated with a structured questionnaire. Self-assessment and external assessment were performed to classify the maturity levels at the UKSH (location Kiel and Lübeck). Every quality item was graded into four categories from "A" (fully implemented) to "D" (not implemented at all). First of all, an external assessment was initiated by the head of the department of quality and risk management. Thereafter, a self-assessment was performed by 46 clinical units of the UKSH. Discrepancies were resolved in a collegial dialogue. Based on these data, overall maturity levels were obtained for every clinical unit. The overall maturity level "A" was reached by three out of 46 (6.5%) clinical units. No unit was graded with maturity level "D". 50% out of all units reached level "B" and 43.5% level "C". The distribution of the four different quality criteria revealed a good implementation of complaint management (maturity levels "A" and "B" in 78.3%), whereas the levels for CIRS were "C" and "D" in 73.9%. Quality assurance and process management showed quite similar distributions for the levels of maturity "B" and "C" (87% QS; 91% PM). The structured analytical tool revealed maturity levels of 46 clinical units of the UKSH and defined the maturity levels of four relevant quality criteria (QS, CIRS, BM, PM). As a consequence, extensive procedures were implemented to raise the standard of quality and risk management. 
In the future, maturity levels will be reevaluated every two years. This qualitative maturity level model enables precise statements, in a simple and efficient way, concerning the presence, manifestation, and development of quality and risk management. © Georg Thieme Verlag KG Stuttgart · New York.
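The grading scheme above rates each clinical unit on four criteria (QS, CIRS, BM, PM) from "A" (fully implemented) to "D" (not implemented at all) and then derives an overall maturity level per unit. A minimal sketch of one plausible aggregation is below; note that the worst-grade rule is an assumption for illustration, as the abstract does not specify how the overall level was computed.

```python
# Sketch: grading a clinical unit on the four quality criteria.
# ASSUMPTION: overall level = worst criterion grade (weakest link);
# the abstract does not state the actual aggregation rule.

GRADES = "ABCD"  # "A" best (fully implemented) ... "D" worst

def overall_maturity(unit_grades):
    """unit_grades: dict mapping criterion name -> grade letter."""
    for criterion, grade in unit_grades.items():
        if grade not in GRADES:
            raise ValueError(f"invalid grade {grade!r} for {criterion}")
    # max by position in GRADES picks the worst (latest) letter
    return max(unit_grades.values(), key=GRADES.index)

print(overall_maturity({"QS": "B", "CIRS": "C", "BM": "A", "PM": "B"}))  # -> C
```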
Viewing pre-60S maturation at a minute’s timescale
Zisser, Gertrude; Ohmayer, Uli; Mauerhofer, Christina; Mitterer, Valentin; Klein, Isabella; Rechberger, Gerald N; Wolinski, Heimo; Prattes, Michael; Pertschy, Brigitte; Milkereit, Philipp
2018-01-01
Abstract The formation of ribosomal subunits is a highly dynamic process that is initiated in the nucleus and involves more than 200 trans-acting factors, some of which accompany the pre-ribosomes into the cytoplasm and have to be recycled into the nucleus. The inhibitor diazaborine prevents cytoplasmic release and recycling of shuttling pre-60S maturation factors by inhibiting the AAA-ATPase Drg1. The failure to recycle these proteins results in their depletion in the nucleolus and halts the pathway at an early maturation step. Here, we made use of the fast onset of inhibition by diazaborine to chase the maturation path in real time, from 27SA2 pre-rRNA-containing pre-ribosomes localized in the nucleolus up to nearly mature 60S subunits shortly after their export into the cytoplasm. This makes it possible, for the first time, to put protein assembly and disassembly reactions as well as pre-rRNA processing into a chronological context, unraveling temporal and functional linkages during ribosome maturation. PMID:29294095
Link, William; Hesed, Kyle Miller
2015-01-01
Knowledge of organisms’ growth rates and ages at sexual maturity is important for conservation efforts and a wide variety of studies in ecology and evolutionary biology. However, these life history parameters may be difficult to obtain from natural populations: individuals encountered may be of unknown age, information on age at sexual maturity may be uncertain and interval-censored, and growth data may include both individual heterogeneity and measurement errors. We analyzed mark–recapture data for Red-backed Salamanders (Plethodon cinereus) to compare sex-specific growth rates and ages at sexual maturity. Aging of individuals was made possible by the use of a von Bertalanffy model of growth, complemented with models for interval-censored and imperfect observations at sexual maturation. Individual heterogeneity in growth was modeled through the use of Gamma processes. Our analysis indicates that female P. cinereus mature earlier and grow more quickly than males, growing to nearly identical asymptotic size distributions as males.
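The von Bertalanffy growth model used above to age individuals has a simple closed form, L(t) = L_inf * (1 - exp(-k * (t - t0))), which can be inverted to back-calculate age from an observed size. A sketch follows; the parameter values are illustrative, not estimates from the salamander study.

```python
import math

# Von Bertalanffy growth: L(t) = L_inf * (1 - exp(-k * (t - t0))).
# Parameter values in the example are illustrative only.

def vb_length(t, L_inf, k, t0=0.0):
    """Expected body size at age t."""
    return L_inf * (1.0 - math.exp(-k * (t - t0)))

def vb_age(L, L_inf, k, t0=0.0):
    """Invert the growth curve: estimated age of an individual of size L
    (valid for 0 <= L < L_inf)."""
    return t0 - math.log(1.0 - L / L_inf) / k

# Round trip: an individual of the expected size at age 2 is aged back to 2.
L = vb_length(2.0, L_inf=50.0, k=0.4)
print(round(vb_age(L, L_inf=50.0, k=0.4), 6))  # -> 2.0
```

The study layers individual heterogeneity (Gamma processes) and measurement error on top of this deterministic curve; the sketch shows only the underlying growth equation.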
Mapping modern software process engineering techniques onto an HEP development environment
NASA Astrophysics Data System (ADS)
Wellisch, J. P.
2003-04-01
One of the most challenging issues faced in HEP in recent years is the question of how to capitalise on software development and maintenance experience in a continuous manner. To capitalise means, in our context, to evaluate and apply new process technologies as they arise, and to further evolve technologies already widely in use. It also implies the definition and adoption of standards. The CMS off-line software improvement effort aims at continual software quality improvement and continual improvement in the efficiency of the working environment, with the goal of facilitating great new physics. To achieve this, we followed a process improvement program based on ISO-15504 and the Rational Unified Process. This experiment in software process improvement in HEP has now been progressing for a period of 3 years. Taking previous experience from ATLAS and SPIDER into account, we used a soft approach of continuous change within the limits of the current culture to create de facto software process standards within the CMS off-line community, as the only viable route to a successful software process improvement program in HEP. We present the CMS approach to software process improvement in this process R&D, and describe lessons learned and mistakes made. We demonstrate the benefits gained and the current status of the software processes established in CMS off-line software.
Genomics of Mature and Immature Olfactory Sensory Neurons
Nickell, Melissa D.; Breheny, Patrick; Stromberg, Arnold J.; McClintock, Timothy S.
2014-01-01
The continuous replacement of neurons in the olfactory epithelium provides an advantageous model for investigating neuronal differentiation and maturation. By calculating the relative enrichment of every mRNA detected in samples of mature mouse olfactory sensory neurons (OSNs), immature OSNs, and the residual population of neighboring cell types, and then comparing these ratios against the known expression patterns of >300 genes, enrichment criteria that accurately predicted the OSN expression patterns of nearly all genes were determined. We identified 847 immature OSN-specific and 691 mature OSN-specific genes. The control of gene expression by chromatin modification and transcription factors, and neurite growth, protein transport, RNA processing, cholesterol biosynthesis, and apoptosis via death domain receptors, were overrepresented biological processes in immature OSNs. Ion transport (ion channels), presynaptic functions, and cilia-specific processes were overrepresented in mature OSNs. Processes overrepresented among the genes expressed by all OSNs were protein and ion transport, ER overload response, protein catabolism, and the electron transport chain. To more accurately represent gradations in mRNA abundance and identify all genes expressed in each cell type, classification methods were used to produce probabilities of expression in each cell type for every gene. These probabilities, which identified 9,300 genes expressed in OSNs, were 96% accurate at identifying genes expressed in OSNs and 86% accurate at discriminating genes specific to mature and immature OSNs. This OSN gene database not only predicts the genes responsible for the major biological processes active in OSNs, but also identifies thousands of never before studied genes that support OSN phenotypes. PMID:22252456
Age-dependent change in urine proteome of healthy individuals
NASA Astrophysics Data System (ADS)
Dobrokhotov, Igor; Liudmila Pastushkova, MRS.; Larina, Irina; Kononikhin, Alexey
The protein composition of urine samples obtained from twenty Russian cosmonauts and thirty-eight healthy volunteers, who had been selected for experiments simulating the physiological effects of microgravity, was analyzed. Special sample preparation was performed, followed by liquid chromatography-mass spectrometry. Liquid chromatography-mass spectrometry of the minor proteins was performed on a nano-HPLC Agilent 1100 system (Agilent Technologies Inc., USA) in combination with an LTQ-FT Ultra mass spectrometer (Thermo Electron, Germany). Lists of the masses of the derived peptides and their fragments were used to search for and identify proteins in the IPI-human database (International Protein Index) using the program Mascot (MS version 2.0.04, UK) according to the following criteria: 1 - enzyme: trypsin; 2 - peptide tol. ± 5 ppm; 3 - MS/MS tol. 0.5 Da. From the list of proteins obtained from the Mascot search, only those proteins identified on the basis of 2 or more peptides with a score greater than 24 were selected. Analysis of the list of proteins was performed using software developed in the laboratory of V. A. Ivanisenko (ICG SB RAS). The age of the healthy individuals ranged from 18 to 54 years. Depending on age, the data were divided into three groups: persons under 25 years (youth and mature age 1), 25-40 years (mature age 2), and 40-54 years (mature age 3). Reliable age-dependent changes in the number of proteins among the groups were detected. It was found that the minimum number of different proteins was detected in the urine of the group of young subjects (under 25 years old), and the maximum was observed in the group of middle-aged persons (25 to 40 years). When the proteins were compared according to their molecular mass, it was revealed that the older group (40-54 years) had a noticeably smaller percentage of high molecular weight proteins than the groups of young and middle-aged persons.
Thus, proteomic studies of urine samples can detect reliable age differences between groups of subjects, reflecting both age-specific features of processes of protein reabsorption by the kidneys, and the aging process in general.
Autonomous Rendezvous and Docking Conference, volume 1
NASA Technical Reports Server (NTRS)
1990-01-01
This document consists of the presentation submitted at the Autonomous Rendezvous and Docking (ARD) Conference. It contains three volumes: ARD hardware technology; ARD software technology; and ARD operations. The purpose of this conference is to identify the technologies required for an on orbit demonstration of the ARD, assess the maturity of these technologies, and provide the necessary insight for a quality assessment of the programmatic management, technical, schedule, and cost risks.
Autonomous Rendezvous and Docking Conference, volume 3
NASA Technical Reports Server (NTRS)
1990-01-01
This document consists of the presentation submitted at the Autonomous Rendezvous and Docking (ARD) Conference. The document contains three volumes: ARD hardware technology; ARD software technology; and ARD operations. The purpose of this conference is to identify the technologies required for an on orbit demonstration of ARD, assess the maturity of these technologies, and provide the necessary insight for a quality assessment of programmatic management, technical, schedule, and cost risks.
Management of Globally Distributed Software Development Projects in Multiple-Vendor Constellations
NASA Astrophysics Data System (ADS)
Schott, Katharina; Beck, Roman; Gregory, Robert Wayne
Global information systems development outsourcing is an apparent trend that is expected to continue in the foreseeable future. IS-related services are increasingly provided not only from different geographical sites simultaneously, but also by multiple service providers based in different countries. The purpose of this paper is to understand how the involvement of multiple service providers affects the management of globally distributed information systems development projects. As research on this topic is scarce, we applied an exploratory, in-depth single-case study design as our research approach. The case we analyzed comprises a global software development outsourcing project initiated by a German bank together with several globally distributed vendors. For data collection and data analysis we adopted techniques suggested by the grounded theory method. Whereas the extant literature points out the increased management overhead associated with multi-sourcing, the analysis of our case suggests that the effort required to manage global outsourcing projects with multiple vendors depends, among other things, on the maturation level of the cooperation within the vendor portfolio. Furthermore, our data indicate that this interplay maturity is positively influenced by knowledge about the client derived from already existing client-vendor relationships. The paper concludes by offering theoretical and practical implications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dinh, Nam; Athe, Paridhi; Jones, Christopher
The Virtual Environment for Reactor Applications (VERA) code suite is assessed in terms of capability and credibility against the Consortium for Advanced Simulation of Light Water Reactors (CASL) Verification and Validation Plan (presented herein) in the context of three selected challenge problems: CRUD-Induced Power Shift (CIPS), Departure from Nucleate Boiling (DNB), and Pellet-Clad Interaction (PCI). Capability refers to evidence of the required functionality for capturing phenomena of interest, while credibility refers to the evidence that provides confidence in the calculated results. For this assessment, each challenge problem defines a set of phenomenological requirements against which the VERA software is assessed. This approach, in turn, enables the focused assessment of only those capabilities relevant to the challenge problem. The evaluation of VERA against the challenge problem requirements represents a capability assessment. The mechanism for assessment is the Sandia-developed Predictive Capability Maturity Model (PCMM), which, for this assessment, evaluates VERA on 8 major criteria: (1) Representation and Geometric Fidelity, (2) Physics and Material Model Fidelity, (3) Software Quality Assurance and Engineering, (4) Code Verification, (5) Solution Verification, (6) Separate Effects Model Validation, (7) Integral Effects Model Validation, and (8) Uncertainty Quantification. For each attribute, a maturity score from zero to three is assigned in the context of each challenge problem. The evaluation of these eight elements constitutes the credibility assessment for VERA.
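The PCMM scoring scheme described above, with eight attributes each rated zero to three per challenge problem, lends itself to a simple tabulation. A hedged sketch follows; the attribute names come from the abstract, but the example scores and the summary statistics (minimum and mean) are invented for illustration and are not part of the PCMM itself.

```python
# The eight PCMM attributes listed in the assessment.
PCMM_ATTRIBUTES = [
    "Representation and Geometric Fidelity",
    "Physics and Material Model Fidelity",
    "Software Quality Assurance and Engineering",
    "Code Verification",
    "Solution Verification",
    "Separate Effects Model Validation",
    "Integral Effects Model Validation",
    "Uncertainty Quantification",
]

def pcmm_summary(scores):
    """Validate one 0-3 score per attribute; report min and mean maturity."""
    if set(scores) != set(PCMM_ATTRIBUTES):
        raise ValueError("a score is required for each of the 8 attributes")
    for attr, s in scores.items():
        if s not in (0, 1, 2, 3):
            raise ValueError(f"{attr}: score must be 0..3, got {s}")
    values = list(scores.values())
    return {"min": min(values), "mean": sum(values) / len(values)}

# Hypothetical scores for one challenge problem (invented for illustration).
example = dict(zip(PCMM_ATTRIBUTES, [2, 2, 3, 2, 1, 2, 1, 1]))
print(pcmm_summary(example))  # -> {'min': 1, 'mean': 1.75}
```

Reporting the minimum alongside the mean highlights the weakest attribute, which is typically where a credibility assessment directs further evidence gathering.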
Software Engineering Program: Software Process Improvement Guidebook
NASA Technical Reports Server (NTRS)
1996-01-01
The purpose of this document is to provide experience-based guidance in implementing a software process improvement program in any NASA software development or maintenance community. This guidebook details how to define, operate, and implement a working software process improvement program. It describes the concept of the software process improvement program and its basic organizational components. It then describes the structure, organization, and operation of the software process improvement program, illustrating all these concepts with specific NASA examples. The information presented in the document is derived from the experiences of several NASA software organizations, including the SEL, the SEAL, and the SORCE. Their experiences reflect many of the elements of software process improvement within NASA. This guidebook presents lessons learned in a form usable by anyone considering establishing a software process improvement program within his or her own environment. This guidebook attempts to balance general and detailed information. It provides material general enough to be usable by NASA organizations whose characteristics do not directly match those of the sources of the information and models presented herein. It also keeps the ideas sufficiently close to the sources of the practical experiences that have generated the models and information.
Code of Federal Regulations, 2013 CFR
2013-01-01
..., CERTIFICATION, AND STANDARDS) United States Standards for Grades of Apples Definitions § 51.312 Mature. “Mature” means that the apples have reached the stage of development which will insure the proper completion of the ripening process. Before a mature apple becomes overripe it will show varying degrees of firmness...
Code of Federal Regulations, 2014 CFR
2014-01-01
..., CERTIFICATION, AND STANDARDS) United States Standards for Grades of Apples Definitions § 51.312 Mature. “Mature” means that the apples have reached the stage of development which will insure the proper completion of the ripening process. Before a mature apple becomes overripe it will show varying degrees of firmness...
Properties of the [NiFe]-hydrogenase maturation protein HypD.
Blokesch, Melanie; Böck, August
2006-07-24
A mutational screen of amino acid residues of hydrogenase maturation protein HypD from Escherichia coli disclosed that seven conserved cysteine residues located in three different motifs in HypD are essential. Evidence is presented for potential functions of these motifs in the maturation process.
Simultaneous measurements of Cotton fiber maturity, fineness, ribbon width, and micronaire
USDA-ARS?s Scientific Manuscript database
Maturity (degree of secondary wall development) and fineness (linear density) are important cotton quality and processing properties, but their direct measurement is often difficult and/or expensive to perform. An indirect but critical measurement of maturity and fineness is micronaire, which is on...
Boulanger-Weill, Jonathan; Candat, Virginie; Jouary, Adrien; Romano, Sebastián A; Pérez-Schuster, Verónica; Sumbre, Germán
2017-06-19
From development up to adulthood, the vertebrate brain is continuously supplied with newborn neurons that integrate into established mature circuits. However, how this process is coordinated during development remains unclear. Using two-photon imaging, GCaMP5 transgenic zebrafish larvae, and sparse electroporation in the larva's optic tectum, we monitored spontaneous and induced activity of large neuronal populations containing newborn and functionally mature neurons. We observed that the maturation of newborn neurons is a 4-day process. Initially, newborn neurons showed undeveloped dendritic arbors, no neurotransmitter identity, and were unresponsive to visual stimulation, although they displayed spontaneous calcium transients. Later on, newborn-labeled neurons began to respond to visual stimuli, but in a very variable manner. At the end of the maturation period, newborn-labeled neurons exhibited visual tuning curves (spatial receptive fields and direction selectivity) and spontaneous correlated activity with neighboring functionally mature neurons. At this developmental stage, newborn-labeled neurons presented complex dendritic arbors and neurotransmitter identity (excitatory or inhibitory). Removal of retinal inputs significantly perturbed the integration of newborn neurons into the functionally mature tectal network. Our results provide a comprehensive description of the maturation of newborn neurons during development and shed light on potential mechanisms underlying their integration into a functionally mature neuronal circuit. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.
Peter, Varghese; Wong, Kogo; Narne, Vijaya Kumar; Sharma, Mridula; Purdy, Suzanne C; McMahon, Catherine
2014-02-01
There are many clinically available tests for the assessment of auditory processing skills in children and adults. However, limited data are available on maturational effects on performance on these tests. The current study investigated maturational effects on auditory processing abilities using three psychophysical measures: the temporal modulation transfer function (TMTF), iterated ripple noise (IRN) perception, and spectral ripple discrimination (SRD). A cross-sectional study. Three groups of subjects were tested: 10 adults (18-30 yr), 10 older children (12-18 yr), and 10 young children (8-11 yr). Temporal envelope processing was measured by obtaining thresholds for amplitude modulation detection as a function of modulation frequency (TMTF; 4, 8, 16, 32, 64, and 128 Hz). Temporal fine structure processing was measured using IRN, and spectral processing was measured using SRD. The results showed that young children had significantly higher modulation thresholds at 4 Hz (TMTF) compared to the other two groups, and poorer SRD scores compared to adults. Results on IRN did not differ across groups. The results suggest that different aspects of auditory processing mature at different ages, and these maturational effects need to be considered when assessing auditory processing in children. American Academy of Audiology.
[Establishment of cervical vertebral skeletal maturation of female children in Shanghai].
Sun, Yan; Chen, Rong-jing; Yu, Quan; Fan, Li; Chen, Wei; Shen, Gang
2009-06-01
To establish a method for quantitatively evaluating the skeletal maturation of the cervical vertebrae of female children in Shanghai. The samples were selected from lateral cephalometric radiographs of 240 Shanghai girls aged 8 to 15 years. Parameters were measured to indicate the morphological changes of the third (C3) and fourth (C4) vertebrae in width, height, and the depth of the inferior curvature. The independent-sample t test and stepwise multiple regression analysis were used to estimate the growth status and the ratios of the C3 and C4 cervical vertebrae with the SPSS 15.0 software package. The physical and morphological contour of the C3 and C4 cervical vertebrae increased proportionately with age. The regression formula indicating the cervical vertebral skeletal age of female children in Shanghai was Y = -5.696 + 8.010 AH3/AP3 + 6.654 AH3/H3 + 6.045 AH4/PH4 (r = 0.912). This regression formula, derived from morphological measurements, quantitatively indicates the skeletal maturation of the cervical vertebrae of female children in Shanghai.
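The regression above is a direct linear combination of three vertebral shape ratios, so applying it is a one-line computation. A sketch follows; the measurement values in the example are invented purely to exercise the formula, not patient data.

```python
def cervical_skeletal_age(AH3, AP3, H3, AH4, PH4):
    """Skeletal-maturation score from C3/C4 shape ratios (r = 0.912):
    Y = -5.696 + 8.010*AH3/AP3 + 6.654*AH3/H3 + 6.045*AH4/PH4
    """
    return -5.696 + 8.010 * AH3 / AP3 + 6.654 * AH3 / H3 + 6.045 * AH4 / PH4

# Invented ratios of 1.0 for each term, purely to exercise the arithmetic:
print(round(cervical_skeletal_age(AH3=1.0, AP3=1.0, H3=1.0, AH4=1.0, PH4=1.0), 3))
# -> 15.013
```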
The roles of family B and D DNA polymerases in Thermococcus species 9°N Okazaki fragment maturation.
Greenough, Lucia; Kelman, Zvi; Gardner, Andrew F
2015-05-15
During replication, Okazaki fragment maturation is a fundamental process that joins discontinuously synthesized DNA fragments into a contiguous lagging strand. Efficient maturation prevents repeat sequence expansions, small duplications, and generation of double-stranded DNA breaks. To address the components required for the process in Thermococcus, Okazaki fragment maturation was reconstituted in vitro using purified proteins from Thermococcus species 9°N or cell extracts. A dual color fluorescence assay was developed to monitor reaction substrates, intermediates, and products. DNA polymerase D (polD) was proposed to function as the replicative polymerase in Thermococcus replicating both the leading and the lagging strands. It is shown here, however, that it stops before the previous Okazaki fragments, failing to rapidly process them. Instead, Family B DNA polymerase (polB) was observed to rapidly fill the gaps left by polD and displaces the downstream Okazaki fragment to create a flap structure. This flap structure was cleaved by flap endonuclease 1 (Fen1) and the resultant nick was ligated by DNA ligase to form a mature lagging strand. The similarities to both bacterial and eukaryotic systems and evolutionary implications of archaeal Okazaki fragment maturation are discussed. © 2015 by The American Society for Biochemistry and Molecular Biology, Inc.
End-to-end operations at the National Radio Astronomy Observatory
NASA Astrophysics Data System (ADS)
Radziwill, Nicole M.
2008-07-01
In 2006 NRAO launched a formal organization, the Office of End to End Operations (OEO), to broaden access to its instruments (VLA/EVLA, VLBA, GBT and ALMA) in the most cost-effective ways possible. The VLA, VLBA and GBT are mature instruments, and the EVLA and ALMA are currently under construction, which presents unique challenges for integrating software across the Observatory. This article 1) provides a survey of the new developments over the past year, and those planned for the next year, 2) describes the business model used to deliver many of these services, and 3) discusses the management models being applied to ensure continuous innovation in operations, while preserving the flexibility and autonomy of telescope software development groups.
GRIDVIEW: Recent Improvements in Research and Education Software for Exploring Mars Topography
NASA Technical Reports Server (NTRS)
Roark, J. H.; Masuoka, C. M.; Frey, H. V.
2004-01-01
GRIDVIEW is being developed by the GEODYNAMICS Branch at NASA's Goddard Space Flight Center and can be downloaded on the web at http://geodynamics.gsfc.nasa.gov/gridview/. The program is very mature and has been successfully used for more than four years, but is still under development as we add new features for data analysis and visualization. The software can run on any computer supported by the IDL virtual machine application supplied by RSI. The virtual machine application is currently available for recent versions of MS Windows, MacOS X, Red Hat Linux and UNIX. Minimum system memory requirement is 32 MB, however loading large data sets may require larger amounts of RAM to function adequately.
NASA Astrophysics Data System (ADS)
Frossard, Frédérique; Trifonova, Anna; Barajas Frutos, Mario
The isolation of rural communities creates special needs for teachers and students in rural schools. The present article describes "Rural Virtual School", a Virtual Community of Practice (VCoP) in which Spanish teachers at rural schools share learning resources and teaching methodologies through social software applications. The article arrives at an evolutionary model in which the use of social software tools evolves together with the needs and activities of the VCoP through the different stages of its lifetime. Currently, the community has reached a high level of maturity and, in order to keep its momentum, the members intentionally use appropriate technologies specially designed to enhance rich, innovative educational approaches, through which they collaboratively generate creative practices.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vijayasaradhi, S.; Doskoch, P.M.; Houghton, A.N.
1991-10-01
A 75-kDa melanosomal glycoprotein (gp75) is the product of a gene that maps to the b (brown) locus, a genetic locus that determines coat color in the mouse. The b locus is conserved (88% identity) between mouse and human. The mouse monoclonal antibody TA99 was used to study the biosynthesis and processing of gp75. gp75 was synthesized as a 55-kDa polypeptide, glycosylated by addition and processing of five or more Asn-linked carbohydrate chains through the cis and trans Golgi, and transported to melanosomes as a mature 75-kDa form. Synthesis and processing of gp75 was rapid (T(1/2) < 30 min), and early steps in processing were required for efficient export; gp75 was quite stable in the melanosome. Studies with inhibitors of steps in oligosaccharide processing showed that alternative forms of gp75 were generated during trimming reactions by mannosidase IA/IB and that further maturation resulted in the two mature forms of gp75. The authors propose that the kinetics of biosynthesis and processing reflect events in the biogenesis and maturation of melanosomes.
An Environmental Management Maturity Model of Construction Programs Using the AHP-Entropy Approach.
Bai, Libiao; Wang, Hailing; Huang, Ning; Du, Qiang; Huang, Youdan
2018-06-23
The accelerating process of urbanization in China has led to considerable opportunities for the development of construction projects; however, environmental issues have become an important constraint on the implementation of these projects. To quantitatively describe the environmental management capabilities of such projects, this paper proposes a 2-dimensional Environmental Management Maturity Model of Construction Program (EMMMCP) based on an analysis of existing projects, group management theory and a management maturity model. In this model, a synergetic process is included to compensate for the lack of consideration of synergies in previous studies; it contributes to the construction of the first dimension, i.e., the environmental management index system. The second dimension, i.e., the maturity level of environmental management, is then constructed by redefining the hierarchical characteristics of construction program (CP) environmental management maturity. Additionally, a mathematical solution to the proposed model is derived via the Analytic Hierarchy Process (AHP)-entropy approach. To verify the effectiveness and feasibility of the proposed model, a computational experiment was conducted. The results show that the approach can not only measure the individual levels of different processes, but also provide a reference for stakeholders when making decisions on the environmental management of construction programs, indicating that the model is reasonable for evaluating the level of environmental management maturity in CP. To our knowledge, this paper is the first study to evaluate the environmental management maturity levels of CP; it fills the gap between project program management and environmental management and provides a reference for relevant management personnel to enhance their environmental management capabilities.
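The entropy half of the AHP-entropy weighting scheme is mechanical enough to sketch. The snippet below is an illustrative implementation of the standard entropy weight method only (the AHP half, subjective pairwise comparison of criteria, is omitted); the score matrix and criteria are hypothetical, not data from the paper.

```python
import math

def entropy_weights(matrix):
    """Entropy weight method: rows are alternatives, columns are criteria.

    Returns one objective weight per criterion; criteria whose scores vary
    more across alternatives carry more information and get larger weights.
    """
    n = len(matrix)        # number of alternatives
    m = len(matrix[0])     # number of criteria
    k = 1.0 / math.log(n)  # normalizes entropy into [0, 1]
    divergences = []
    for j in range(m):
        col = [row[j] for row in matrix]
        total = sum(col)
        # proportion contributed by each alternative under criterion j
        p = [x / total for x in col]
        # Shannon entropy of the column (0 * log 0 treated as 0)
        e = -k * sum(pi * math.log(pi) for pi in p if pi > 0)
        divergences.append(1.0 - e)  # higher divergence -> more informative
    s = sum(divergences)
    return [d / s for d in divergences]

# Hypothetical scores of 3 construction programs on 3 environmental criteria
scores = [[0.8, 0.5, 0.9],
          [0.6, 0.5, 0.4],
          [0.7, 0.5, 0.2]]
w = entropy_weights(scores)
# criterion 2 has identical scores everywhere, so its weight is ~0
```

In the paper's full approach these objective weights would be combined with AHP-derived subjective weights before scoring maturity levels.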
Genomic profiling of bovine corpus luteum maturation
Wigoda, Noa; Ben-Dor, Shifra; Orr, Irit; Meidan, Rina
2018-01-01
To unveil novel global changes associated with corpus luteum (CL) maturation, we analyzed transcriptome data for the bovine CL on days 4 and 11, representing the developing vs. mature gland. Our analyses revealed 681 differentially expressed genes (363 and 318 on days 4 and 11, respectively), with ≥2-fold change and FDR of <5%. Different gene ontology (GO) categories were represented prominently at these stages (day 4: cell cycle, chromosome, DNA metabolic process and replication; day 11: immune response, lipid metabolic process and complement activation). Based on bioinformatic analyses, the expression of selected genes in day 4 and 11 CL was validated with quantitative real-time PCR. Cell-specific expression was also determined in enriched luteal endothelial and steroidogenic cells. Genes related to the angiogenic process, such as NOS3, which maintains dilated vessels, and MMP9, a matrix-degrading enzyme, were higher on day 4. Importantly, our data suggest that the day 11 CL acquires mechanisms to prevent blood vessel sprouting and promote vessel maturation by expressing NOTCH4 and JAG1, which are greatly enriched in luteal endothelial cells. Another endothelial-specific gene, CD300LG, was identified here in the CL for the first time. CD300LG is an adhesion molecule enabling lymphocyte migration; its higher levels at mid-cycle are expected to support the transmigration of immune cells into the CL at this stage. Together with steroidogenic genes, most of the genes regulating the de novo cholesterol biosynthetic pathway (e.g. HMGCS, HMGCR) and cholesterol uptake from plasma (LDLR, APOD and APOE) were upregulated in the mature CL. These findings provide new insight into the processes involved in CL maturation, including blood vessel growth and stabilization, leucocyte transmigration, and progesterone synthesis as the CL matures. PMID:29590145
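The selection criteria above (≥2-fold change, FDR < 5%) can be sketched in code. The abstract does not say which FDR procedure was used; the Benjamini-Hochberg step-up procedure below is a common default, and the expression values and p-values are invented purely for illustration.

```python
def benjamini_hochberg(pvals, alpha=0.05):
    """Return one significance flag per p-value under BH FDR control."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    max_k = 0
    for rank, i in enumerate(order, start=1):
        # step-up rule: largest rank k with p_(k) <= k * alpha / m
        if pvals[i] <= rank * alpha / m:
            max_k = rank
    flags = [False] * m
    for rank, i in enumerate(order, start=1):
        if rank <= max_k:
            flags[i] = True
    return flags

def differentially_expressed(expr_a, expr_b, pvals, alpha=0.05, min_fold=2.0):
    """Indices of genes passing both the fold-change and FDR criteria."""
    sig = benjamini_hochberg(pvals, alpha)
    hits = []
    for i, (a, b) in enumerate(zip(expr_a, expr_b)):
        fold = max(a, b) / min(a, b)  # fold change in either direction
        if sig[i] and fold >= min_fold:
            hits.append(i)
    return hits

# Toy data: 4 genes, expression on "day 4" vs "day 11"
pvals = [0.001, 0.04, 0.9, 0.004]
hits = differentially_expressed([10, 10, 10, 10], [25, 30, 11, 5], pvals)
# genes 0 and 3 pass both filters
```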
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-02
... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Developing Software Life Cycle Processes Used in... revised regulatory guide (RG), revision 1 of RG 1.173, ``Developing Software Life Cycle Processes for... Developing a Software Project Life Cycle Process,'' issued 2006, with the clarifications and exceptions as...
ERIC Educational Resources Information Center
Rong, Guoping; Shao, Dong
2012-01-01
The importance of delivering software process courses to software engineering students has been more and more recognized in China in recent years. However, students usually cannot fully appreciate the value of software process courses by only learning methodology and principle in the classroom. Therefore, a process-specific project course was…
Mature Students Speak Up: Career Exploration and the Working Alliance
ERIC Educational Resources Information Center
Pott, Terilyn
2015-01-01
This exploratory study was undertaken to learn more about how mature students perceive the career counselling process in a post-secondary institution. Through the use of critical incident technique this study examined how three mature students interpret their relationship between themselves and their counsellors. Significant factors identified as…
NASA Technical Reports Server (NTRS)
Condon, Steven; Hendrick, Robert; Stark, Michael E.; Steger, Warren
1997-01-01
The Flight Dynamics Division (FDD) of NASA's Goddard Space Flight Center (GSFC) recently embarked on a far-reaching revision of its process for developing and maintaining satellite support software. The new process relies on an object-oriented software development method supported by a domain specific library of generalized components. This Generalized Support Software (GSS) Domain Engineering Process is currently in use at the NASA GSFC Software Engineering Laboratory (SEL). The key facets of the GSS process are (1) an architecture for rapid deployment of FDD applications, (2) a reuse asset library for FDD classes, and (3) a paradigm shift from developing software to configuring software for mission support. This paper describes the GSS architecture and process, results of fielding the first applications, lessons learned, and future directions.
Software component quality evaluation
NASA Technical Reports Server (NTRS)
Clough, A. J.
1991-01-01
The paper describes a software inspection process that can be used to evaluate the quality of software components. Quality criteria, process application, independent testing of the process and proposed associated tool support are covered. Early results indicate that this technique is well suited for assessing software component quality in a standardized fashion. With automated machine assistance to facilitate both the evaluation and selection of software components, such a technique should promote effective reuse of software components.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1997-07-01
New hardware and software tools build on existing platforms and add performance and ease-of-use benefits as the struggle to find and produce hydrocarbons at the lowest cost becomes more and more competitive. Software tools now provide geoscientists and petroleum engineers with a better understanding of reservoirs, from the shape and makeup of formations to behavior projections as hydrocarbons are extracted. Petroleum software tools allow scientists to simulate oil flow, predict the life expectancy of a reservoir, and even help determine how to extend the life and economic viability of the reservoir. The requirement of the petroleum industry to find and extract petroleum more efficiently drives the solutions provided by software and service companies. To one extent or another, most of the petroleum software products available today have achieved an acceptable level of competency. Innovative, high-impact products from small, focused companies often were bought out by larger companies with deeper pockets if their developers couldn't fund their expansion. Other products disappeared from the scene because they were unable to evolve fast enough to compete. There are still enough small companies around producing excellent products to prevent the marketplace from feeling too narrow and lacking in choice. Oil companies requiring specific solutions to their problems have helped fund product development within the commercial sector. As the industry has matured, strategic alliances between vendors, both hardware and software, have provided market advantages, often combining strengths to enter new and undeveloped areas for technology. The pace of technological development has been fast and constant.
The ALMA common software: dispatch from the trenches
NASA Astrophysics Data System (ADS)
Schwarz, J.; Sommer, H.; Jeram, B.; Sekoranja, M.; Chiozzi, G.; Grimstrup, A.; Caproni, A.; Paredes, C.; Allaert, E.; Harrington, S.; Turolla, S.; Cirami, R.
2008-07-01
The ALMA Common Software (ACS) provides both an application framework and CORBA-based middleware for the distributed software system of the Atacama Large Millimeter Array. Building upon open-source tools such as the JacORB, TAO and OmniORB ORBs, ACS supports the development of component-based software in any of three languages: Java, C++ and Python. Now in its seventh major release, ACS has matured, both in its feature set and in its reliability and performance. However, it is only recently that the ALMA observatory's hardware and application software have reached a level at which they can exploit and challenge the infrastructure that ACS provides. In particular, the availability of an Antenna Test Facility (ATF) at the site of the Very Large Array in New Mexico has enabled us to exercise and test the still-evolving end-to-end ALMA software under realistic conditions. The major focus of ACS, consequently, has shifted from the development of new features to consideration of how best to use those that already exist. Configuration details which could be neglected for the purpose of running unit tests or skeletal end-to-end simulations have turned out to be sensitive levers for achieving satisfactory performance in a real-world environment. Surprising behavior in some open-source tools has required us to choose between patching code that we did not write or addressing its deficiencies by implementing workarounds in our own software. We will discuss these and other aspects of our recent experience at the ATF and in simulation.
Healthcare software assurance.
Cooper, Jason G; Pauley, Keith A
2006-01-01
Software assurance is a rigorous, lifecycle phase-independent set of activities which ensure completeness, safety, and reliability of software processes and products. This is accomplished by guaranteeing conformance to all requirements, standards, procedures, and regulations. These assurance processes are even more important when coupled with healthcare software systems, embedded software in medical instrumentation, and other healthcare-oriented life-critical systems. The current Food and Drug Administration (FDA) regulatory requirements and guidance documentation do not address certain aspects of complete software assurance activities. In addition, the FDA's software oversight processes require enhancement to include increasingly complex healthcare systems such as Hospital Information Systems (HIS). The importance of complete software assurance is introduced, current regulatory requirements and guidance are discussed, and the necessity for enhancements to the current processes is highlighted.
PMID:17238324
Physical abrasion of mafic minerals and basalt grains: application to Martian aeolian deposits
Cornwall, Carin; Bandfield, Joshua L.; Titus, Timothy N.; Schreiber, B. C.; Montgomery, D.R.
2015-01-01
Sediment maturity, or the mineralogical and physical characterization of sediment deposits, has been used to determine sediment source, transport medium and distance, weathering processes, and paleoenvironments on Earth. Mature terrestrial sands are dominated by quartz, which is abundant in source lithologies on Earth and is physically and chemically stable under a wide range of conditions. Immature sands, such as those rich in feldspars or mafic minerals, are composed of grains that are easily physically weathered and highly susceptible to chemical weathering. On Mars, which is predominantly mafic in composition, terrestrial standards of sediment maturity are not applicable. In addition, the martian climate today is cold and dry, and sediments are likely to be heavily influenced by physical weathering rather than chemical weathering. Due to these large differences in weathering processes and composition, martian sediments require an alternate maturity index. Abrasion tests have been conducted on a variety of mafic materials, and the results suggest that mature martian sediments may be composed of well sorted, well rounded, spherical basalt grains. In addition, any volcanic glass present is likely to persist in a mechanical weathering environment, while chemically altered products are likely to be winnowed away. A modified sediment maturity index is proposed that can be used in future studies to constrain sediment source, paleoclimate, mechanisms for sediment production, and surface evolution. This maturity index may also provide details about erosional and sediment transport systems and preservation processes of layered deposits.
Effects of inhibitors on 1-methyladenine induced maturation of starfish oocytes
NASA Astrophysics Data System (ADS)
Lee, Harold H.; Xu, Quanhan
1986-12-01
1-methyladenine (1-MA) induces starfish oocyte maturation via a surface reaction followed by the appearance of a cytoplasmic maturation factor, which in turn induces germinal vesicle breakdown (GVBD) to resume meiosis. Cellular mechanisms involved in GVBD were investigated by microinjection of metabolic inhibitors. Colchicine (Co) inhibited maturation, cytochalasin B (CB) delayed GVBD, and actinomycin D (Act-D) and puromycin (Pu) had no effect. It appears that the microtubule and microfilament systems are associated with nuclear membrane dissolution during the process of starfish oocyte maturation.
Model-based software process improvement
NASA Technical Reports Server (NTRS)
Zettervall, Brenda T.
1994-01-01
The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five-stage cyclic approach for organizational process improvement. The cycle consists of the initiating, diagnosing, establishing, acting, and leveraging phases.
The TAME Project: Towards improvement-oriented software environments
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Rombach, H. Dieter
1988-01-01
Experience from a dozen years of analyzing software engineering processes and products is summarized as a set of software engineering and measurement principles that argue for software engineering process models that integrate sound planning and analysis into the construction process. In the TAME (Tailoring A Measurement Environment) project at the University of Maryland, such an improvement-oriented software engineering process model was developed that uses the goal/question/metric paradigm to integrate the constructive and analytic aspects of software development. The model provides a mechanism for formalizing the characterization and planning tasks, controlling and improving projects based on quantitative analysis, learning in a deeper and more systematic way about the software process and product, and feeding the appropriate experience back into the current and future projects. The TAME system is an instantiation of the TAME software engineering process model as an ISEE (integrated software engineering environment). The first in a series of TAME system prototypes has been developed. An assessment of experience with this first limited prototype is presented, including a reassessment of its initial architecture.
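The goal/question/metric paradigm that TAME builds on has a simple hierarchical shape: each measurement goal is refined into questions, and each question into concrete metrics. A minimal sketch of that structure follows; the goal, question, and metric names are hypothetical and this is not the TAME system's actual data model.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Metric:
    name: str
    value: Optional[float] = None  # None until a measurement is collected

@dataclass
class Question:
    text: str
    metrics: List[Metric] = field(default_factory=list)

@dataclass
class Goal:
    purpose: str
    questions: List[Question] = field(default_factory=list)

    def open_questions(self):
        """Questions with at least one metric not yet measured."""
        return [q for q in self.questions
                if any(m.value is None for m in q.metrics)]

# Hypothetical GQM plan for a defect-reduction goal
goal = Goal("Reduce defect density in release N+1")
goal.questions.append(Question(
    "What is the current defect density?",
    [Metric("defects_per_kloc"), Metric("kloc", 120.0)]))
# defects_per_kloc is still unmeasured, so the question remains open
```

The feedback loop the abstract describes would then compare measured values against the goal and feed the experience back into planning for the next project.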
Process Based on SysML for New Launchers System and Software Developments
NASA Astrophysics Data System (ADS)
Hiron, Emmanuel; Miramont, Philippe
2010-08-01
The purpose of this paper is to present the Astrium-ST engineering process based on SysML. This process is currently being set up in the frame of common CNES/Astrium-ST R&T studies related to the Ariane 5 electrical system and flight software modelling. The tool used to set up this process is Rhapsody release 7.3 from IBM [1]. The process focuses on the system engineering phase dedicated to software, with the objective of generating both system documents (sequential system design and flight control) and software specifications.
Defective postsecretory maturation of MUC5B mucin in cystic fibrosis airways
Abdullah, Lubna H.; Evans, Jessica R.; Wang, T. Tiffany; Ford, Amina A.; Makhov, Alexander M.; Nguyen, Kristine; Coakley, Raymond D.; Griffith, Jack D.; Davis, C. William; Ballard, Stephen T.
2017-01-01
In cystic fibrosis (CF), airway mucus becomes thick and viscous, and its clearance from the airways is impaired. The gel-forming mucins undergo an ordered “unpacking/maturation” process after granular release that requires an optimum postsecretory environment, including hydration and pH. We hypothesized that this unpacking process is compromised in the CF lung due to abnormal transepithelial fluid transport that reduces airway surface hydration and alters ionic composition. Using human tracheobronchial epithelial cells derived from non-CF and CF donors and mucus samples from human subjects and domestic pigs, we investigated the process of postsecretory mucin unfolding/maturation, how these processes are defective in CF airways, and the probable mechanism underlying defective unfolding. First, we found that mucins released into a normal lung environment transform from a compact granular form to a linear form. Second, we demonstrated that this maturation process is defective in the CF airway environment. Finally, we demonstrated that independent of HCO3− and pH levels, airway surface dehydration was the major determinant of this abnormal unfolding process. This defective unfolding/maturation process after granular release suggests that the CF extracellular environment is ion/water depleted and likely contributes to abnormal mucus properties in CF airways prior to infection and inflammation. PMID:28352653
Software Process Improvement: Supporting the Linking of the Software and the Business Strategies
NASA Astrophysics Data System (ADS)
Albuquerque, Adriano Bessa; Rocha, Ana Regina; Lima, Andreia Cavalcanti
The market is becoming more and more competitive; many products and services depend on software, and software is one of the most important assets influencing organizations' businesses. In this context, companies must deal carefully with software, whether developing or acquiring it. One perspective that can help to take advantage of software, supporting the business effectively, is to invest in the organization's software processes. This paper presents an approach to evaluate and improve the process assets of software organizations, based on internationally well-known standards and process models. This approach is supported by automated tools from the TABA Workstation and is part of a wider improvement strategy constituted of three layers (organizational layer, process execution layer and external entity layer). Moreover, this paper presents the experience of its use and the results.
Precursor processing for plant peptide hormone maturation by subtilisin-like serine proteinases.
Schardon, Katharina; Hohl, Mathias; Graff, Lucile; Pfannstiel, Jens; Schulze, Waltraud; Stintzi, Annick; Schaller, Andreas
2016-12-23
Peptide hormones that regulate plant growth and development are derived from larger precursor proteins by proteolytic processing. Our study addressed the role of subtilisin-like proteinases (SBTs) in this process. Using tissue-specific expression of proteinase inhibitors as a tool to overcome functional redundancy, we found that SBT activity was required for the maturation of IDA (INFLORESCENCE DEFICIENT IN ABSCISSION), a peptide signal for the abscission of floral organs in Arabidopsis. We identified three SBTs that process the IDA precursor in vitro, and this processing was shown to be required for the formation of mIDA (the mature and bioactive form of IDA) as the endogenous signaling peptide in vivo. Hence, SBTs act as prohormone convertases in plants, and several functionally redundant SBTs contribute to signal biogenesis. Copyright © 2016, American Association for the Advancement of Science.
7 Processes that Enable NASA Software Engineering Technologies: Value-Added Process Engineering
NASA Technical Reports Server (NTRS)
Housch, Helen; Godfrey, Sally
2011-01-01
The presentation reviews Agency process requirements and the purpose, benefits, and experiences of seven software engineering processes. The processes include: product integration, configuration management, verification, software assurance, measurement and analysis, requirements management, and planning and monitoring.
Considerations for choosing an electronic medical record for an ophthalmology practice.
DeBry, P W
2001-04-01
To give a brief overview of issues pertinent to selecting an ophthalmic electronic medical record (EMR) program and to outline the company demographics and software capabilities of the major vendors in this area. Software companies shipping an EMR package were contacted to obtain information on their software and company demographics. The focus was on companies selectively marketing to ophthalmology practices, and, therefore, most were selected based on their representation at the 1998 and/or 1999 American Academy of Ophthalmology meeting. Software companies that responded to repeated inquiries in a timely fashion were included. Sixteen companies were evaluated. Electronic medical records packages ranged from $3000 to $80 000 (mean, approximately $30 000). Company demographics revealed a range from 1 to 1600 employees (mean, 204). Most of these companies have been in business for 6 years or less (range, 1-15 years; mean, 6 years). My opinions concerning various aspects of the EMR are presented. There is a wide range of EMR products available for the ophthalmology practice. Computer technology has matured to a point at which the graphical demands of the ophthalmology EMR can be satisfied. Weaknesses do exist in the inherent difficulty of recording an ophthalmology encounter, the relative adolescence of software companies, and the lack of standards in the industry.
Software Development Standard Processes (SDSP)
NASA Technical Reports Server (NTRS)
Lavin, Milton L.; Wang, James J.; Morillo, Ronald; Mayer, John T.; Jamshidian, Barzia; Shimizu, Kenneth J.; Wilkinson, Belinda M.; Hihn, Jairus M.; Borgen, Rosana B.; Meyer, Kenneth N.;
2011-01-01
A JPL-created set of standard processes is to be used throughout the lifecycle of software development. These SDSPs cover a range of activities, from management and engineering activities, to assurance and support activities. These processes must be applied to software tasks per a prescribed set of procedures. JPL's Software Quality Improvement Project is currently working at the behest of the JPL Software Process Owner to ensure that all applicable software tasks follow these procedures. The SDSPs are captured as a set of 22 standards in JPL's software process domain. They were developed in-house at JPL by a number of Subject Matter Experts (SMEs) residing primarily within the Engineering and Science Directorate, but also from the Business Operations Directorate and Safety and Mission Success Directorate. These practices include not only currently performed best practices, but also JPL-desired future practices in key thrust areas like software architecting and software reuse analysis. Additionally, these SDSPs conform to many standards and requirements to which JPL projects are beholden.
Contingency theoretic methodology for agent-based web-oriented manufacturing systems
NASA Astrophysics Data System (ADS)
Durrett, John R.; Burnell, Lisa J.; Priest, John W.
2000-12-01
The development of distributed, agent-based, web-oriented, N-tier Information Systems (IS) must be supported by a design methodology capable of responding to the convergence of shifts in business process design, organizational structure, computing, and telecommunications infrastructures. We introduce a contingency theoretic model for the use of open, ubiquitous software infrastructure in the design of flexible organizational IS. Our basic premise is that developers should change the way they view the software design process, from the solution of a problem to the dynamic creation of teams of software components. We postulate that developing effective, efficient, flexible, component-based distributed software requires reconceptualizing the current development model. The basic concepts of distributed software design are merged with the environment-causes-structure relationship from contingency theory; the task-uncertainty of organizational information-processing relationships from information processing theory; and the concept of inter-process dependencies from coordination theory. Software processes are considered as employees, groups of processes as software teams, and distributed systems as software organizations. Design techniques already used in the design of flexible business processes and well researched in the domain of the organizational sciences are presented. Guidelines that can be utilized in the creation of component-based distributed software are discussed.
Analysis of epididymal sperm maturation by MALDI profiling and top-down mass spectrometry.
Labas, Valérie; Spina, Lucie; Belleannee, Clémence; Teixeira-Gomes, Ana-Paula; Gargaros, Audrey; Dacheux, Françoise; Dacheux, Jean-Louis
2015-01-15
The fertilization ability of male gametes is achieved after their transit through the epididymis, where important post-gonadal differentiation occurs in different cellular compartments. Most of these maturational modifications occur at the protein level. The epididymal sperm maturation process was investigated using the ICM-MS (Intact Cell MALDI-TOF MS) approach on boar spermatozoa isolated from four different epididymal regions (immature to mature stage). Differential and quantitative MALDI-TOF profiling for whole cells or sub-cellular fractions was combined with targeted top-down MS in order to identify endogenous biomolecules. Using this approach, 172 m/z peaks ranging between 2 and 20 kDa were found to be modified during maturation of sperm. Using top-down MS, 62 m/z peaks were identified, corresponding to peptidoforms/proteoforms with post-translational modifications (MS data are available via ProteomeXchange with identifier PXD001303). Many of the endogenous peptides were characterized as N-terminal, C-terminal, or internal fragments of proteins presenting specific cleavages, suggesting the presence of sequential protease activities in the spermatozoa. This is the first time that such proteolytic activities could be evidenced for various sperm proteins through quantification of their proteolytic products. ICM-MS/top-down MS thus proved to be a valid approach for peptidome/degradome studies and provided new contributions to understanding of the maturation process of the male gamete involved in the development of male fertility. This peptidomic study (i) characterized the peptidome of epididymal spermatozoa from boar (Sus scrofa); (ii) established characteristic molecular phenotypes distinguishing degrees of maturation of spermatozoa during epididymal transit; and (iii) revealed that protease activities were at the origin of numerous peptides from known and unknown proteins involved in sperm maturation and/or fertility processes. Copyright © 2014 Elsevier B.V. All rights reserved.
Software Formal Inspections Guidebook
NASA Technical Reports Server (NTRS)
1993-01-01
The Software Formal Inspections Guidebook is designed to support the inspection process of software developed by and for NASA. This document provides information on how to implement a recommended and proven method for conducting formal inspections of NASA software. This Guidebook is a companion document to NASA Standard 2202-93, Software Formal Inspections Standard, approved April 1993, which provides the rules, procedures, and specific requirements for conducting software formal inspections. Application of the Formal Inspections Standard is optional to NASA program or project management. In cases where program or project management decide to use the formal inspections method, this Guidebook provides additional information on how to establish and implement the process. The goal of the formal inspections process as documented in the above-mentioned Standard and this Guidebook is to provide a framework and model for an inspection process that will enable the detection and elimination of defects as early as possible in the software life cycle. An ancillary aspect of the formal inspection process incorporates the collection and analysis of inspection data to effect continual improvement in the inspection process and the quality of the software subjected to the process.
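The data-collection side of the inspection process lends itself to a small sketch. The snippet below aggregates hypothetical inspection records into the kind of defect-density metrics a continual-improvement loop would feed on; the record layout and artifact names are invented for illustration and are not the forms defined by NASA Standard 2202-93.

```python
def inspection_summary(inspections):
    """Aggregate formal-inspection records into simple process metrics.

    Each record is (artifact, size_ksloc, major_defects, minor_defects).
    Returns the overall major-defect density (defects per KSLOC) and a
    per-artifact density map, which can flag artifacts needing re-inspection.
    """
    total_ksloc = sum(r[1] for r in inspections)
    total_major = sum(r[2] for r in inspections)
    per_artifact = {r[0]: r[2] / r[1] for r in inspections}
    return total_major / total_ksloc, per_artifact

# Hypothetical inspection records for two source artifacts
records = [("gnc_filter.c", 2.0, 6, 11),
           ("telemetry.c", 1.5, 9, 4)]
overall, by_artifact = inspection_summary(records)
# overall = 15 major defects / 3.5 KSLOC
```

Trending such densities across inspections over time is one way the collected data can drive improvement of both the inspection process and the inspected software.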
Software-Engineering Process Simulation (SEPS) model
NASA Technical Reports Server (NTRS)
Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.
1992-01-01
The Software Engineering Process Simulation (SEPS) model is described, which was developed at JPL. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform postmortem assessments.
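The feedback idea behind SEPS, where schedule pressure feeds back into productivity and quality, can be illustrated with a toy loop. This is a sketch of the system-dynamics style only, not the actual SEPS equations; all coefficients and the pressure model are invented for illustration.

```python
def simulate(total_tasks, staff, weeks, nominal_prod=1.0):
    """Toy schedule-pressure feedback loop over a project's duration.

    Pressure above 1.0 means the team is behind schedule; it mildly
    boosts output but raises the fraction of work that becomes rework.
    """
    done, rework = 0.0, 0.0
    history = []
    for week in range(weeks):
        remaining = total_tasks - done
        needed_rate = remaining / max(weeks - week, 1)
        capacity = staff * nominal_prod
        pressure = min(needed_rate / capacity, 2.0)
        productivity = nominal_prod * (0.8 + 0.2 * pressure)  # mild speed-up
        error_fraction = min(0.1 * pressure, 0.5)             # quality penalty
        output = staff * productivity
        rework += output * error_fraction
        done = min(total_tasks, done + output * (1 - error_fraction))
        history.append((week, round(done, 1), round(rework, 1)))
    return history

# Understaffed project: 100 tasks, 5 people, 10 weeks
history = simulate(total_tasks=100, staff=5, weeks=10)
```

Even this crude loop exhibits the tradeoff SEPS is built to explore: pushing a fixed-schedule project harder inflates the rework stock rather than closing the gap.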
Risk Quantification of Systems Engineering Documents Improves Probability of DOD Project Success
2009-09-01
comprehensive risk model for DoD milestone review documentation as well as recommended changes to the Capability Maturity Model Integration (CMMI) Project...Milestone Documentation, Project Planning, Rational Frame, Political Frame, CMMI Project Planning Process Area, CMMI Risk Management Process Area...well as recommended changes to the Capability Maturity Model Integration (CMMI) Project Planning and Risk Management process areas. The intent is to
NASA Astrophysics Data System (ADS)
Piras, Annamaria; Malucchi, Giovanni
2012-08-01
In the design and development phase of a new program, one of the critical aspects is the integration of all the functional requirements of the system and the control of overall consistency between the identified needs on one side and the available resources on the other, especially when neither is yet consolidated but both evolve as the program matures.

The Integrated Engineering Harness Avionics and Software database (IDEHAS) is a tool developed to support this process within the Avionics and Software disciplines through the different phases of the program. The tool is designed to allow an incremental build-up of the avionics and software systems, from the description of high-level architectural data (available in the early stages of the program), to the definition of pin-to-pin connectivity information (typically consolidated during design finalization), and finally to the construction and validation of the detailed telemetry parameters and commands used in the test phases and in the Mission Control Centre. The key feature of this approach and of the associated tool is that it allows all these data to be defined and maintained in a single, consistent environment.

On one side, a system-level, concurrent approach requires the ability to easily integrate and update the best data available from the early stages of a program, in order to improve confidence in consistency and to control the design information. On the other side, the amount of information of different types and the cross-relationships among the data imply highly consolidated structures requiring many checks to guarantee data consistency, with negative effects on simplicity and flexibility, often limiting the attention given to special needs and to the interfaces with other disciplines.
The Knowledge-Based Software Assistant: Beyond CASE
NASA Technical Reports Server (NTRS)
Carozzoni, Joseph A.
1993-01-01
This paper will outline the similarities and differences between two paradigms of software development. Both support the whole software life cycle and provide automation for most of the software development process, but have different approaches. The CASE approach is based on a set of tools linked by a central data repository. This tool-based approach is data driven and views software development as a series of sequential steps, each resulting in a product. The Knowledge-Based Software Assistant (KBSA) approach, a radical departure from existing software development practices, is knowledge driven and centers around a formalized software development process. KBSA views software development as an incremental, iterative, and evolutionary process with development occurring at the specification level.
Information Flow Integrity for Systems of Independently-Developed Components
2015-06-22
We also examined three programs (Apache, MySQL, and PHP) in detail to evaluate the efficacy of using the provided package test suites to generate... method are just as effective as hooks that were manually placed over the course of years, while greatly reducing the burden on programmers... to validate optimizations of real-world, mature applications: the Apache software suite, the Mozilla Suite, and the MySQL database.
Software Capability Evaluation (SCE) Version 2.0 Implementation Guide
1994-02-01
...incorporated into the source selection sponsoring organization's technical/management team for incorporation into acquisition decisions. The SCE team... expertise, past performance, and organizational capacity in acquisition decisions. Capability Maturity Model basic concepts: the CMM is based on the...
Additive effect of multiple pharmacological chaperones on maturation of CFTR processing mutants
Wang, Ying; Loo, Tip W.; Bartlett, M. Claire; Clarke, David M.
2007-01-01
The most common cause of CF (cystic fibrosis) is the deletion of Phe508 (ΔF508) in the CFTR [CF TM (transmembrane) conductance regulator] chloride channel. One major problem with ΔF508 CFTR is that the protein is defective in folding so that little mature protein is delivered to the cell surface. Expression of ΔF508 CFTR in the presence of small molecules known as correctors or pharmacological chaperones can increase the level of mature protein. Unfortunately, the efficiency of corrector-induced maturation of ΔF508 CFTR is probably too low to have therapeutic value and approaches are needed to increase maturation efficiency. We postulated that expression of ΔF508 CFTR in the presence of multiple correctors that bound to different sites may have an additive effect on maturation. In support of this mechanism, we found that expression of P-glycoprotein (CFTR's sister protein) processing mutants in the presence of two compounds that bind to different sites (rhodamine B and Hoechst 33342) had an additive effect on maturation. Therefore we tested whether expression of ΔF508 CFTR in the presence of combinations of three different classes of corrector molecules would increase its maturation efficiency. It was found that the combination of the quinazoline VRT-325 together with the thiazole corr-2b or bisaminomethylbithiazole corr-4a doubled the steady-state maturation efficiency of ΔF508 CFTR (approx. 40% of total CFTR was mature protein) compared with expression in the presence of a single compound. The additive effect of the correctors on ΔF508 CFTR maturation suggests that they directly interact at different sites of the protein. PMID:17535157
Spiritual Maturation and Religious Behaviors in Christian University Students
ERIC Educational Resources Information Center
Welch, Ronald D.; Mellberg, Kimberlee
2008-01-01
Spiritual maturation processes of internalization and questing were assessed at a Christian university to determine their relationship to year in school and certain religious behaviors. This was a first step toward the development of a new model of Christian higher education that will intentionally facilitate spiritual maturation. A group of 179…
Cone and Seed Maturation of Southern Pines
James P. Barnett
1976-01-01
If slightly reduced yields and viability are acceptable, loblolly and slash cone collections can begin 2 to 3 weeks before maturity if the cones are stored before processing. Longleaf (P. palustris Mill.) pine cones should be collected only when mature, as storage decreased germination of seeds from immature cones. Biochemical analyses to determine reducing sugar...
ERIC Educational Resources Information Center
Argon, Türkan; Sezen-Gültekin, Gözde
2016-01-01
Moral maturity, defined as the competence in moral emotions, thoughts, judgments, attitudes and behaviors, is one of the most important qualities that the would-be teachers at Faculties of Education must possess. Teachers with moral maturity will train students with the qualities of reliability, responsibility, fairness, objectivity, consistency…
USDA-ARS?s Scientific Manuscript database
Maturation of Atlantic salmon Salmo salar is an extremely complex process, particularly in aquaculture systems, with many variables (known or otherwise) having the capacity to influence the timing and prevalence of maturation, and acting as promoters and/or inhibitors of sexual development. The vast...
USDA-ARS?s Scientific Manuscript database
Micronaire is a key cotton quality assessment property, impacting downstream fiber processing and dye consistency. A component of micronaire is fiber maturity (degree of secondary wall development). Historically, micronaire and maturity are measured in a laboratory under tight environmental condit...
A Maturity Model for Assessing the Use of ICT in School Education
ERIC Educational Resources Information Center
Solar, Mauricio; Sabattin, Jorge; Parada, Victor
2013-01-01
This article describes an ICT-based and capability-driven model for assessing ICT in education capabilities and maturity of schools. The proposed model, called ICTE-MM (ICT in School Education Maturity Model), has three elements supporting educational processes: information criteria, ICT resources, and leverage domains. Changing the traditional…
Cytoskeletal changes in oocytes and early embryos during in vitro fertilization process in mice.
Gumus, E; Bulut, H E; Kaloglu, C
2010-02-01
The cytoskeleton plays crucial roles in the development and fertilization of germ cells and in early embryo development. The growth, maturation and fertilization of oocytes require active movement and correct localization of cellular organelles, which is accomplished by the re-organization of microtubules and actin filaments. Therefore, the aim of the present study was to determine the changes in the cytoskeleton during the in vitro fertilization process using appropriate immunofluorescence techniques. While the chromatin content was found to be scattered throughout the nucleus during the oocyte maturation period, it was seen only around the nucleolus following the completion of maturation. During oocyte maturation, microtubules were regularly distributed throughout the ooplasm and were later localized in the subcortical region of oocytes. Similarly, microfilaments were scattered throughout the ooplasm during the oocyte maturation period, whereas they were seen in the subcortical region around the polar body and above the meiotic spindle throughout the late developmental stages. In conclusion, the changes that occurred in microtubules and microfilaments might be closely related to the re-organization of the genetic material during oocyte maturation and early embryo development.
MHC drives TCR repertoire shaping, but not maturation, in recent thymic emigrants.
Houston, Evan G; Fink, Pamela J
2009-12-01
After developing in the thymus, recent thymic emigrants (RTEs) enter the lymphoid periphery and undergo a maturation process as they transition into the mature naive (MN) T cell compartment. This maturation presumably shapes RTEs into a pool of T cells best fit to function robustly in the periphery without causing autoimmunity; however, the mechanism and consequences of this maturation process remain unknown. Using a transgenic mouse system that specifically labels RTEs, we tested the influence of MHC molecules, key drivers of intrathymic T cell selection and naive peripheral T cell homeostasis, in shaping the RTE pool in the lymphoid periphery. We found that the TCRs expressed by RTEs are skewed to longer CDR3 regions compared with those of MN T cells, suggesting that MHC does streamline the TCR repertoire of T cells as they transition from the RTE to the MN T cell stage. This conclusion is borne out in studies in which the representation of individual TCRs was followed as a function of time since thymic egress. Surprisingly, we found that MHC is dispensable for the phenotypic and functional maturation of RTEs.
Clarity: An Open Source Manager for Laboratory Automation
Delaney, Nigel F.; Echenique, José Rojas; Marx, Christopher J.
2013-01-01
Software to manage automated laboratories interfaces with hardware instruments, gives users a way to specify experimental protocols, and schedules activities to avoid hardware conflicts. In addition to these basics, modern laboratories need software that can run multiple different protocols in parallel and that can be easily extended to interface with a constantly growing diversity of techniques and instruments. We present Clarity: a laboratory automation manager that is hardware agnostic, portable, extensible and open source. Clarity provides critical features including remote monitoring, robust error reporting by phone or email, and full state recovery in the event of a system crash. We discuss the basic organization of Clarity; demonstrate an example of its implementation for the automated analysis of bacterial growth; and describe how the program can be extended to manage new hardware. Clarity is mature; well documented; actively developed; written in C# for the Common Language Infrastructure; and is free and open source software. These advantages set Clarity apart from currently available laboratory automation programs. PMID:23032169
Measuring the software process and product: Lessons learned in the SEL
NASA Technical Reports Server (NTRS)
Basili, V. R.
1985-01-01
The software development process and product can and should be measured. The software measurement process at the Software Engineering Laboratory (SEL) has taught a major lesson: develop a goal-driven paradigm (also characterized as a goal/question/metric paradigm) for data collection. Project analysis under this paradigm leads to a design for evaluating and improving the methodology of software development and maintenance.
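The goal/question/metric paradigm named above can be made concrete with a small data structure; the goal, questions, and metric names below are invented examples for illustration, not the SEL's actual metrics.

```python
# Illustrative goal/question/metric (GQM) tree for planning data collection.
# The goal, questions, and metric names are invented examples, not the SEL's.
gqm = {
    "goal": "Improve defect detection effectiveness during code inspections",
    "questions": {
        "How many defects escape inspection?": [
            "defects found in test per KLOC",
            "defects found in inspection per KLOC",
        ],
        "How much effort do inspections cost?": [
            "inspection hours per KLOC",
            "rework hours per defect",
        ],
    },
}

# The discipline the paradigm imposes: every metric traces to a question,
# and every question to the goal; data with no question is not collected.
metrics = [m for q in gqm["questions"].values() for m in q]
```

The point of the structure is the traceability constraint in the final comment, which is what makes the data collection "goal-driven" rather than opportunistic.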
Tomato seeds maturity detection system based on chlorophyll fluorescence
NASA Astrophysics Data System (ADS)
Li, Cuiling; Wang, Xiu; Meng, Zhijun
2016-10-01
Chlorophyll fluorescence intensity can be used as seed maturity and quality evaluation indicator. Chlorophyll fluorescence intensity of seed coats is tested to judge the level of chlorophyll content in seeds, and further to judge the maturity and quality of seeds. This research developed a detection system of tomato seeds maturity based on chlorophyll fluorescence spectrum technology, the system included an excitation light source unit, a fluorescent signal acquisition unit and a data processing unit. The excitation light source unit consisted of two high power LEDs, two radiators and two constant current power supplies, and it was designed to excite chlorophyll fluorescence of tomato seeds. The fluorescent signal acquisition unit was made up of a fluorescence spectrometer, an optical fiber, an optical fiber scaffolds and a narrowband filter. The data processing unit mainly included a computer. Tomato fruits of green ripe stage, discoloration stage, firm ripe stage and full ripe stage were harvested, and their seeds were collected directly. In this research, the developed tomato seeds maturity testing system was used to collect fluorescence spectrums of tomato seeds of different maturities. Principal component analysis (PCA) method was utilized to reduce the dimension of spectral data and extract principal components, and PCA was combined with linear discriminant analysis (LDA) to establish discriminant model of tomato seeds maturity, the discriminant accuracy was greater than 90%. Research results show that using chlorophyll fluorescence spectrum technology is feasible for seeds maturity detection, and the developed tomato seeds maturity testing system has high detection accuracy.
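The PCA-plus-LDA discrimination step described above can be sketched with scikit-learn on synthetic "spectra". The band count, class structure, and noise level here are assumptions for illustration, not the paper's data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Synthetic "fluorescence spectra": 4 maturity classes, 40 seeds each,
# 200 bands; each class mean is a Gaussian peak shifted with maturity.
n_classes, n_per, n_bands = 4, 40, 200
wavelengths = np.linspace(650.0, 800.0, n_bands)
X_parts, y_parts = [], []
for c in range(n_classes):
    peak = 690.0 + 10.0 * c
    mean = np.exp(-((wavelengths - peak) ** 2) / (2.0 * 15.0 ** 2))
    X_parts.append(mean + 0.05 * rng.standard_normal((n_per, n_bands)))
    y_parts.append(np.full(n_per, c))
X, y = np.vstack(X_parts), np.concatenate(y_parts)

# PCA reduces the spectral dimension; LDA then discriminates maturity class.
model = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
model.fit(X, y)
```

On real data one would of course score on held-out seeds rather than the training set; the sketch only shows how the two steps chain together.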
Zhang, Hui; Wang, Jing; Sun, Ling; Xu, Qiuqin; Hou, Miao; Ding, Yueyue; Huang, Jie; Chen, Ye; Cao, Lei; Zhang, Jianmin; Qian, Weiguo; Lv, Haitao
2015-01-01
Obesity has become an increasingly serious health problem and a popular research topic. It is associated with many diseases, especially cardiovascular disease (CVD)-related endothelial dysfunction. This study analyzed genes related to endothelial dysfunction and obesity and then summarized their most significant signaling pathways. Genes related to vascular endothelial dysfunction and obesity were extracted from the PubMed database and analyzed with the STRING, DAVID, and GeneGo MetaCore software. In PubMed, 142 genes associated with obesity were found to play a role in endothelial dysfunction. A significant pathway (angiotensin system maturation in protein folding and maturation) associated with obesity and endothelial dysfunction was identified. The genes and the pathway explored may play an important role in obesity. Further studies on preventing vascular endothelial dysfunction in obesity should be conducted by targeting these loci and pathways.
Diederichs, Sven; Haber, Daniel A
2007-12-14
MicroRNAs are small endogenous noncoding RNAs involved in posttranscriptional gene regulation. During microRNA biogenesis, Drosha and Dicer process the primary transcript (pri-miRNA) through a precursor hairpin (pre-miRNA) to the mature miRNA. The miRNA is incorporated into the RNA-Induced Silencing Complex (RISC) with Argonaute proteins, the effector molecules in RNA interference (RNAi). Here, we show that all Argonautes elevate mature miRNA expression posttranscriptionally, independent of RNase activity. Also, we identify a role for the RISC slicer Argonaute2 (Ago2) in cleaving the pre-miRNA to an additional processing intermediate, termed Ago2-cleaved precursor miRNA or ac-pre-miRNA. This endogenous, on-pathway intermediate results from cleavage of the pre-miRNA hairpin 12 nucleotides from its 3'-end. By analogy to siRNA processing, Ago2 cleavage may facilitate removal of the nicked passenger strand from RISC after maturation. The multiple roles of Argonautes in the RNAi effector phase and miRNA biogenesis and maturation suggest coordinate regulation of microRNA expression and function.
The Elements of an Effective Software Development Plan - Software Development Process Guidebook
2011-11-11
standards and practices required for all XMPL software development. This SDP implements the <corporate> Standard Software Process (SSP), as tailored... • Developing and integrating reusable software products • Approach to managing COTS/Reuse software implementation • COTS/Reuse software selection... final selection and submit to change board for approval. Maintenance: monitor current products for obsolescence or end of support; track new...
Structure and Uncoating of Immature Adenovirus
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perez-Berna, A.J.; Mangel, W.; Marabini, R.
2009-09-18
Maturation via proteolytic processing is a common trait in the viral world and is often accompanied by large conformational changes and rearrangements in the capsid. The adenovirus protease has been shown to play a dual role in the viral infectious cycle: (a) in maturation, as viral assembly starts with precursors to several of the structural proteins but ends with proteolytically processed versions in the mature virion, and (b) in entry, because protease-impaired viruses have difficulties in endosome escape and uncoating. Indeed, viruses that have not undergone proteolytic processing are not infectious. We studied the three-dimensional structure of immature adenovirus particles as represented by the adenovirus type 2 thermosensitive mutant ts1 grown under non-permissive conditions and compared it with the mature capsid. Our three-dimensional electron microscopy maps at subnanometer resolution indicate that adenovirus maturation does not involve large-scale conformational changes in the capsid. Difference maps reveal the locations of unprocessed peptides pIIIa and pVI and help define their role in capsid assembly and maturation. An intriguing difference appears in the core, indicating a more compact organization and increased stability of the immature cores. We have further investigated these properties by in vitro disassembly assays. Fluorescence and electron microscopy experiments reveal differences in the stability and uncoating of immature viruses, both at the capsid and core levels, as well as disassembly intermediates not previously imaged.
CrossTalk: The Journal of Defense Software Engineering. Volume 18, Number 9
2005-09-01
2004. 12. Humphrey, Watts. Introduction to the Personal Software Process. Addison-Wesley, 1997. 13. Humphrey, Watts. Introduction to the Team... The Personal Software Process (PSP) is a software development process originated by Watts Humphrey at the Software Engineering Institute (SEI) in the... meets its commitments and bring a sense of control and predictability into an apparently chaotic project. References: 1. Humphrey, Watts. Coaching
Ying, Diwen; Peng, Juan; Xu, Xinyan; Li, Kan; Wang, Yalin; Jia, Jinping
2012-08-30
A comparative study of treating mature landfill leachate with various treatment processes was conducted to investigate whether the method of combined processes of internal micro-electrolysis (IME) without aeration and IME with full aeration in one reactor was an efficient treatment for mature landfill leachate. A specifically designed novel sequencing batch internal micro-electrolysis reactor (SIME) with the latest automation technology was employed in the experiment. Experimental data showed that the combined processes obtained a high COD removal efficiency of 73.7 ± 1.3%, which was 15.2% and 24.8% higher than that of the IME with and without aeration, respectively. The SIME reactor also exhibited a COD removal efficiency of 86.1 ± 3.8% for mature landfill leachate in continuous operation, which is much higher (p<0.05) than that of the conventional treatments of electrolysis (22.8-47.0%), coagulation-sedimentation (18.5-22.2%), and the Fenton process (19.9-40.2%), respectively. The innovative concept behind this excellent performance is the combined effect of the reductive and oxidative processes of the IME and the integrated electro-coagulation. Optimal operating parameters, including the initial pH, Fe/C mass ratio, air flow rate, and addition of H2O2, were optimized. All results show that the SIME reactor is a promising and efficient technology for treating mature landfill leachate.
Computer-aided software development process design
NASA Technical Reports Server (NTRS)
Lin, Chi Y.; Levary, Reuven R.
1989-01-01
The authors describe an intelligent tool designed to aid managers of software development projects in planning, managing, and controlling the development process of medium- to large-scale software projects. Its purpose is to reduce uncertainties in the budget, personnel, and schedule planning of software development projects. It is based on a dynamic model of the software development and maintenance life-cycle process. This dynamic process is composed of a number of time-varying, interacting developmental phases, each characterized by its intended functions and requirements. System dynamics is used as the modeling methodology. The resulting Software Life-Cycle Simulator (SLICS) and the hybrid expert simulation system of which it is a subsystem are described.
E. Gaige; D.B. Dail; D.Y. Hollinger; E.A. Davidson; I.J. Fernandez; H. Sievering; A. White; W. Halteman
2007-01-01
Most experimental additions of nitrogen to forest ecosystems apply the N to the forest floor, bypassing important processes taking place in the canopy, including canopy retention of N and/or conversion of N from one form to another. To quantify these processes, we carried out a large-scale experiment and determined the fate of nitrogen applied directly to a mature...
Thinning of the lateral prefrontal cortex during adolescence predicts emotion regulation in females.
Vijayakumar, Nandita; Whittle, Sarah; Yücel, Murat; Dennison, Meg; Simmons, Julian; Allen, Nicholas B
2014-11-01
Adolescence is a crucial period for the development of adaptive emotion regulation strategies. Despite the fact that structural maturation of the prefrontal cortex during adolescence is often assumed to underlie the maturation of emotion regulation strategies, no longitudinal studies have directly assessed this relationship. This study examined whether use of cognitive reappraisal strategies during late adolescence was predicted by (i) absolute prefrontal cortical thickness during early adolescence and (ii) structural maturation of the prefrontal cortex between early and mid-adolescence. Ninety-two adolescents underwent baseline and follow-up magnetic resonance imaging scans when they were aged approximately 12 and 16 years, respectively. FreeSurfer software was used to obtain cortical thickness estimates for three prefrontal regions [anterior cingulate cortex; dorsolateral prefrontal cortex (dlPFC); ventrolateral prefrontal cortex (vlPFC)]. The Emotion Regulation Questionnaire was completed when adolescents were aged approximately 19 years. Results showed that greater cortical thinning of the left dlPFC and left vlPFC during adolescence was significantly associated with greater use of cognitive reappraisal in females, though no such relationship was evident in males. Furthermore, baseline left dlPFC thickness predicted cognitive reappraisal at trend level. These findings suggest that cortical maturation may play a role in the development of adaptive emotion regulation strategies during adolescence.
New Developments in the Technology Readiness Assessment Process in US DOE-EM - 13247
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krahn, Steven; Sutter, Herbert; Johnson, Hoyt
2013-07-01
A Technology Readiness Assessment (TRA) is a systematic, metric-based process and accompanying report that evaluates the maturity of the technologies used in systems; it is designed to measure technology maturity using the Technology Readiness Level (TRL) scale pioneered by the National Aeronautics and Space Administration (NASA) in the 1980s. More recently, DoD has adopted and provided systematic guidance for performing TRAs and determining TRLs. In 2007 the GAO recommended that the DOE adopt the NASA/DoD methodology for evaluating technology maturity. Earlier, in 2006-2007, DOE-EM had conducted pilot TRAs on a number of projects at Hanford and Savannah River. In March 2008, DOE-EM issued a process guide, which established TRAs as an integral part of DOE-EM's Project Management Critical Decision Process. Since the development of its detailed TRA guidance in 2008, DOE-EM has continued to accumulate experience in the conduct of TRAs and the process for evaluating technology maturity. DOE has developed guidance on TRAs applicable department-wide. DOE-EM's experience with the TRA process, the evaluations that led to recently proposed revisions to the DOE-EM TRA/TMP Guide, and the content of the proposed changes incorporating these lessons learned and insights are described. (authors)
Sastradipura, D F; Nakanishi, H; Tsukuba, T; Nishishita, K; Sakai, H; Kato, Y; Gotow, T; Uchiyama, Y; Yamamoto, K
1998-05-01
Cathepsin E is a major nonlysosomal, intracellular aspartic proteinase that localizes in various cellular compartments such as the plasma membrane, endosome-like organelles, and the endoplasmic reticulum (ER). To learn the segregation mechanisms of cathepsin E into its appropriate cellular destinations, the present studies were initiated to define the biosynthesis, processing, and intracellular localization as well as the site of proteolytic maturation of the enzyme in primary cultures of rat brain microglia. Immunohistochemical and immunoblot analyses revealed that cathepsin E was the most abundant in microglia among various brain cell types, where the enzyme existed predominantly as the mature enzyme. Immunoelectron microscopy studies showed the presence of the enzyme predominantly in the endosome-like vacuoles and partly in the vesicles located in the trans-Golgi area and the lumen of ER. In the primary cultured microglial cells labeled with [35S]methionine, >95% of labeled cathepsin E were represented by a 46-kDa polypeptide (reduced form) after a 30-min pulse. Most of it was proteolytically processed via a 44-kDa intermediate to a 42-kDa mature form within 4 h of chase. This processing was completely inhibited by bafilomycin A1, a specific inhibitor of vacuolar-type H+-ATPase. Brefeldin A, a blocker for the traffic of secretory proteins from the ER to the Golgi complex, also inhibited the processing of procathepsin E and enhanced its degradation. Procathepsin E, after pulse-labeling, showed complete susceptibility to endoglycosidase H, whereas the mature enzyme almost acquired resistance to endoglycosidases H as well as F. 
The present studies provide the first evidence that cathepsin E in microglia is first synthesized as the inactive precursor bearing high-mannose oligosaccharides and processed to the active mature enzyme with complex-type oligosaccharides via the intermediate form and that the final proteolytic maturation step occurs in endosome-like acidic compartments.
Climbing the ladder: capability maturity model integration level 3
NASA Astrophysics Data System (ADS)
Day, Bryce; Lutteroth, Christof
2011-02-01
This article details the attempt to form a complete workflow model for an information and communication technologies (ICT) company in order to achieve a capability maturity model integration (CMMI) maturity rating of 3. During this project, business processes across the company's core and auxiliary sectors were documented and extended using modern enterprise modelling tools and The Open Group Architecture Framework (TOGAF) methodology. Different challenges were encountered with regard to process customisation and tool support for enterprise modelling. In particular, there were problems with the reuse of process models, the integration of different project management methodologies and the integration of the Rational Unified Process development process framework that had to be solved. We report on these challenges and the perceived effects of the project on the company. Finally, we point out research directions that could help to improve the situation in the future.
Computation of Sensitivity Derivatives of Navier-Stokes Equations using Complex Variables
NASA Technical Reports Server (NTRS)
Vatsa, Veer N.
2004-01-01
Accurate computation of sensitivity derivatives is becoming an important item in Computational Fluid Dynamics (CFD) because of recent emphasis on using nonlinear CFD methods in aerodynamic design, optimization, stability and control related problems. Several techniques are available to compute gradients or sensitivity derivatives of desired flow quantities or cost functions with respect to selected independent (design) variables. Perhaps the most common and oldest method is to use straightforward finite differences for the evaluation of sensitivity derivatives. Although very simple, this method is prone to errors associated with the choice of step sizes and can be cumbersome for geometric variables. The cost per design variable for computing sensitivity derivatives with central differencing is at least equal to the cost of three full analyses, but is usually much larger in practice due to difficulty in choosing step sizes. Another approach gaining popularity is the use of Automatic Differentiation software (such as ADIFOR) to process the source code, which in turn can be used to evaluate the sensitivity derivatives of preselected functions with respect to chosen design variables. In principle, this approach is also very straightforward and quite promising. The main drawback is the large memory requirement, because memory use increases linearly with the number of design variables. ADIFOR software can also be cumbersome for large CFD codes and has not yet reached a full maturity level for production codes, especially in parallel computing environments.
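The complex-variable technique named in the title is commonly implemented as the complex-step derivative approximation, f'(x) ≈ Im[f(x + ih)]/h, which avoids the subtractive cancellation that limits the step size of finite differencing. A minimal sketch; the test function is an arbitrary analytic stand-in for a CFD cost function, not one from the paper:

```python
import cmath
import math

def complex_step(f, x, h=1e-30):
    """Complex-step derivative: Im[f(x + ih)]/h. Because no subtraction of
    nearly equal values occurs, h can be tiny with no loss of accuracy."""
    return f(complex(x, h)).imag / h

def central_diff(f, x, h=1e-8):
    """Conventional central difference, limited by cancellation error."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

# Arbitrary analytic stand-in for a cost function: f(x) = exp(x) * sin(x).
f_complex = lambda z: cmath.exp(z) * cmath.sin(z)
f_real = lambda x: math.exp(x) * math.sin(x)

x0 = 0.7
exact = math.exp(x0) * (math.sin(x0) + math.cos(x0))
cs_err = abs(complex_step(f_complex, x0) - exact)
fd_err = abs(central_diff(f_real, x0) - exact)
# cs_err sits at machine precision even with h = 1e-30; fd_err is limited
# to roughly 1e-8 by the compromise between truncation and rounding error.
```

The catch, as the abstract implies for CFD codes, is that the whole solver must be evaluated in complex arithmetic, so the code has to be "complexified" throughout.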
GlycReSoft: A Software Package for Automated Recognition of Glycans from LC/MS Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maxwell, Evan; Tan, Yan; Tan, Yuxiang
2012-09-26
Glycosylation modifies the physicochemical properties and protein binding functions of glycoconjugates. These modifications are biosynthesized in the endoplasmic reticulum and Golgi apparatus by a series of enzymatic transformations that are under complex control. As a result, mature glycans on a given site are heterogeneous mixtures of glycoforms. This gives rise to a spectrum of adhesive properties that strongly influences interactions with binding partners and resultant biological effects. In order to understand the roles glycosylation plays in normal and disease processes, efficient structural analysis tools are necessary. In the field of glycomics, liquid chromatography/mass spectrometry (LC/MS) is used to profile the glycans present in a given sample. This technology enables comparison of glycan compositions and abundances among different biological samples, i.e. normal versus disease, normal versus mutant, etc. Manual analysis of the glycan profiling LC/MS data is extremely time-consuming and efficient software tools are needed to eliminate this bottleneck. In this work, we have developed a tool to computationally model LC/MS data to enable efficient profiling of glycans. Using LC/MS data deconvoluted by Decon2LS/DeconTools, we built a list of unique neutral masses corresponding to candidate glycan compositions summarized over their various charge states, adducts and range of elution times. Our work aims to provide confident identification of true compounds in complex data sets that are not amenable to manual interpretation. This capability is an essential part of glycomics work flows. We demonstrate this tool, GlycReSoft, using an LC/MS dataset on tissue derived heparan sulfate oligosaccharides. The software, code and a test data set are publically archived under an open source license.
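The "list of unique neutral masses" step can be sketched as a simple tolerance-based grouping pass over deconvoluted observations. The record fields and the 10 ppm tolerance below are assumptions for illustration, not GlycReSoft's actual schema.

```python
# Tolerance-based grouping of deconvoluted LC/MS observations into unique
# neutral masses. Field names and the 10 ppm tolerance are illustrative
# assumptions, not GlycReSoft's actual schema.

def group_neutral_masses(observations, tol_ppm=10.0):
    """Collapse observations (dicts with neutral_mass, charge, rt, abundance)
    into one entry per unique neutral mass, summing abundance over charge
    states and merging elution-time ranges."""
    groups = []
    for obs in sorted(observations, key=lambda o: o["neutral_mass"]):
        tol = (groups[-1]["neutral_mass"] * tol_ppm * 1e-6) if groups else 0.0
        if groups and obs["neutral_mass"] - groups[-1]["neutral_mass"] <= tol:
            g = groups[-1]  # within tolerance: same candidate composition
            g["abundance"] += obs["abundance"]
            g["charges"].add(obs["charge"])
            g["rt_range"] = (min(g["rt_range"][0], obs["rt"]),
                             max(g["rt_range"][1], obs["rt"]))
        else:
            groups.append({"neutral_mass": obs["neutral_mass"],
                           "abundance": obs["abundance"],
                           "charges": {obs["charge"]},
                           "rt_range": (obs["rt"], obs["rt"])})
    return groups

observations = [
    {"neutral_mass": 1000.000, "charge": 2, "rt": 12.1, "abundance": 5.0},
    {"neutral_mass": 1000.005, "charge": 3, "rt": 12.4, "abundance": 3.0},
    {"neutral_mass": 1216.423, "charge": 2, "rt": 18.9, "abundance": 7.0},
]
groups = group_neutral_masses(observations)
```

Here the first two observations fall within 10 ppm of each other and collapse into one neutral-mass entry seen at charges 2 and 3, while the third stays separate.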
Operable Data Management for Ocean Observing Systems
NASA Astrophysics Data System (ADS)
Chavez, F. P.; Graybeal, J. B.; Godin, M. A.
2004-12-01
As oceanographic observing systems become more numerous and complex, data management solutions must follow. Most existing oceanographic data management systems fall into one of three categories: they have been developed as dedicated solutions, with limited application to other observing systems; they expect that data will be pre-processed into well-defined formats, such as netCDF; or they are conceived as robust, generic data management solutions, with high complexity but low maturity and adoption rates to match. Each approach has strengths and weaknesses; no approach yet fully addresses, nor takes advantage of, the sophistication of ocean observing systems as they are now conceived. In this presentation we describe critical data management requirements for advanced ocean observing systems, of the type envisioned by ORION and IOOS. By defining common requirements -- functional, qualitative, and programmatic -- for all such ocean observing systems, the performance and nature of the general data management solution can be characterized. Issues such as scalability, maintaining metadata relationships, data access security, visualization, and operational flexibility suggest baseline architectural characteristics, which may in turn lead to reusable components and approaches. Interoperability with other data management systems, with standards-based solutions in metadata specification and data transport protocols, and with the data management infrastructure envisioned by IOOS and ORION, can also be used to define necessary capabilities. Finally, some requirements for the software infrastructure of ocean observing systems can be inferred. Early operational results and lessons learned, from development and operations of MBARI ocean observing systems, are used to illustrate key requirements, choices, and challenges.
Reference systems include the Monterey Ocean Observing System (MOOS), its component software systems (Software Infrastructure and Applications for MOOS, and the Shore Side Data System), and the Autonomous Ocean Sampling Network (AOSN).
Experimental Evaluation of a Serious Game for Teaching Software Process Modeling
ERIC Educational Resources Information Center
Chaves, Rafael Oliveira; von Wangenheim, Christiane Gresse; Furtado, Julio Cezar Costa; Oliveira, Sandro Ronaldo Bezerra; Santos, Alex; Favero, Eloi Luiz
2015-01-01
Software process modeling (SPM) is an important area of software engineering because it provides a basis for managing, automating, and supporting software process improvement (SPI). Teaching SPM is a challenging task, mainly because it lays great emphasis on theory and offers few practical exercises. Furthermore, as yet few teaching approaches…
Open source libraries and frameworks for biological data visualisation: a guide for developers.
Wang, Rui; Perez-Riverol, Yasset; Hermjakob, Henning; Vizcaíno, Juan Antonio
2015-04-01
Recent advances in high-throughput experimental techniques have led to an exponential increase in both the size and the complexity of the data sets commonly studied in biology. Data visualisation is increasingly used as the key to unlock this data, going from hypothesis generation to model evaluation and tool implementation. It is becoming more and more the heart of bioinformatics workflows, enabling scientists to reason and communicate more effectively. In parallel, there has been a corresponding trend towards the development of related software, which has triggered the maturation of different visualisation libraries and frameworks. For bioinformaticians, scientific programmers and software developers, the main challenge is to pick out the most fitting one(s) to create clear, meaningful and integrated data visualisation for their particular use cases. In this review, we introduce a collection of open source or free to use libraries and frameworks for creating data visualisation, covering the generation of a wide variety of charts and graphs. We will focus on software written in Java, JavaScript or Python. We truly believe this software offers the potential to turn tedious data into exciting visual stories. © 2014 The Authors. PROTEOMICS published by Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
A Maturity Model: Does It Provide a Path for Online Course Design?
ERIC Educational Resources Information Center
Neuhauser, Charlotte
2004-01-01
Maturity models are successfully used by organizations attempting to improve their processes, products, and delivery. As more faculty include online course design and teaching, a maturity model of online course design may serve as a tool in planning and assessing their courses for improvement based on best practices. This article presents such a…
Morphometry and Connectivity of the Fronto-Parietal Verbal Working Memory Network in Development
ERIC Educational Resources Information Center
Ostby, Ylva; Tamnes, Christian K.; Fjell, Anders M.; Walhovd, Kristine B.
2011-01-01
Two distinctly different maturational processes--cortical thinning and white matter maturation--take place in the brain as we mature from late childhood to adulthood. To what extent does each contribute to the development of complex cognitive functions like working memory? The independent and joint contributions of cortical thickness of regions of…
[Scars, physiology, classification and assessment].
Roques, Claude
2013-01-01
A skin scar is the sign of tissue repair following damage to the skin. Once formed, it follows a process of maturation which, after several months, results in a mature scar. This can be pathological with functional and/or aesthetic consequences. It is important to assess the scar as it matures in order to adapt the treatment to its evolution.
Biological maturation of youth athletes: assessment and implications.
Malina, Robert M; Rogol, Alan D; Cumming, Sean P; Coelho e Silva, Manuel J; Figueiredo, Antonio J
2015-07-01
The search for talent is pervasive in youth sports. Selection/exclusion in many sports follows a maturity-related gradient largely during the interval of puberty and growth spurt. As such, there is emphasis on methods for assessing maturation. Commonly used methods for assessing status (skeletal age, secondary sex characteristics) and estimating timing (ages at peak height velocity (PHV) and menarche) in youth athletes and two relatively recent anthropometric (non-invasive) methods (status-percentage of predicted near adult height attained at observation, timing-predicted maturity offset/age at PHV) are described and evaluated. The latter methods need further validation with athletes. Currently available data on the maturity status and timing of youth athletes are subsequently summarised. Selection for sport and potential maturity-related correlates are then discussed in the context of talent development and associated models. Talent development from novice to elite is superimposed on a constantly changing base-the processes of physical growth, biological maturation and behavioural development, which occur simultaneously and interact with each other. The processes which are highly individualised also interact with the demands of a sport per se and with involved adults (coaches, trainers, administrators, parents/guardians). Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
NASA Astrophysics Data System (ADS)
Demenev, A. G.
2018-02-01
The present work analyzes the high-performance computing (HPC) infrastructure capabilities available at Perm State University for solving aircraft engine aeroacoustics problems. We explore the ability to develop new computational aeroacoustics methods/solvers for computer-aided engineering (CAE) systems to handle complicated industrial problems of engine noise prediction. Leading aircraft engine engineering companies, including “UEC-Aviadvigatel” JSC (our industrial partner in Perm, Russia), require such methods/solvers to optimize aircraft engine geometry for fan noise reduction. We analysed Perm State University's HPC hardware resources and software services for efficient use. The results demonstrate that the Perm State University HPC infrastructure is mature enough to tackle industrial-scale problems of developing a CAE system with HPC methods and CFD solvers.
Understanding Acceptance of Software Metrics--A Developer Perspective
ERIC Educational Resources Information Center
Umarji, Medha
2009-01-01
Software metrics are measures of software products and processes. Metrics are widely used by software organizations to help manage projects, improve product quality and increase efficiency of the software development process. However, metrics programs tend to have a high failure rate in organizations, and developer pushback is one of the sources…
Final Report of the NASA Office of Safety and Mission Assurance Agile Benchmarking Team
NASA Technical Reports Server (NTRS)
Wetherholt, Martha
2016-01-01
With the software industry rapidly transitioning from waterfall to Agile processes, Terry Wilcutt, Chief, Safety and Mission Assurance, Office of Safety and Mission Assurance (OSMA), established the Agile Benchmarking Team (ABT) to ensure that the NASA Safety and Mission Assurance (SMA) community remains in a position to perform reliable Software Assurance (SA) on NASA's critical software (SW) systems. The Team's tasks were: 1. Research background literature on current Agile processes, 2. Perform benchmark activities with other organizations that are involved in software Agile processes to determine best practices, 3. Collect information on Agile-developed systems to enable improvements to the current NASA standards and processes to enhance their ability to perform reliable software assurance on NASA Agile-developed systems, 4. Suggest additional guidance and recommendations for updates to those standards and processes, as needed. The ABT's findings and recommendations for software management, engineering and software assurance are addressed herein.
Woodruff Carr, Kali; Fitzroy, Ahren B; Tierney, Adam; White-Schwoch, Travis; Kraus, Nina
2017-01-01
Speech communication involves integration and coordination of sensory perception and motor production, requiring precise temporal coupling. Beat synchronization, the coordination of movement with a pacing sound, can be used as an index of this sensorimotor timing. We assessed adolescents' synchronization and capacity to correct asynchronies when given online visual feedback. Variability of synchronization while receiving feedback predicted phonological memory and reading sub-skills, as well as maturation of cortical auditory processing; less variable synchronization during the presence of feedback tracked with maturation of cortical processing of sound onsets and resting gamma activity. We suggest the ability to incorporate feedback during synchronization is an index of intentional, multimodal timing-based integration in the maturing adolescent brain. Precision of temporal coding across modalities is important for speech processing and literacy skills that rely on dynamic interactions with sound. Synchronization employing feedback may prove useful as a remedial strategy for individuals who struggle with timing-based language learning impairments. Copyright © 2016 Elsevier Inc. All rights reserved.
Smith, Ashley R; Chein, Jason; Steinberg, Laurence
2013-07-01
While there is little doubt that risk-taking is generally more prevalent during adolescence than before or after, the underlying causes of this pattern of age differences have long been investigated and debated. One longstanding popular notion is the belief that risky and reckless behavior in adolescence is tied to the hormonal changes of puberty. However, the interactions between pubertal maturation and adolescent decision making remain largely understudied. In the current review, we discuss changes in decision making during adolescence, focusing on the asynchronous development of the affective, reward-focused processing system and the deliberative, reasoned processing system. As discussed, differential maturation in the structure and function of brain systems associated with these systems leaves adolescents particularly vulnerable to socio-emotional influences and risk-taking behaviors. We argue that this asynchrony may be partially linked to pubertal influences on development and specifically on the maturation of the affective, reward-focused processing system. Copyright © 2013 Elsevier Inc. All rights reserved.
A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems
NASA Astrophysics Data System (ADS)
Li, Yu; Oberweis, Andreas
Aiming at increasing flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets as basic methodology for business processes modeling, and an XML net-based software toolset providing comprehensive functionalities for POIS development.
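The firing semantics underlying Petri net-based process models can be illustrated with a minimal token-game sketch (a plain place/transition net, not the XML nets the chapter introduces; the class, place, and transition names here are illustrative):

```python
class PetriNet:
    """Minimal place/transition net: a marking maps places to token counts."""

    def __init__(self, marking):
        self.marking = dict(marking)   # place -> token count
        self.transitions = {}          # name -> (input places, output places)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        """A transition is enabled when every input place holds enough tokens."""
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= n for p, n in inputs.items())

    def fire(self, name):
        """Consume tokens from input places and produce them on output places."""
        if not self.enabled(name):
            raise ValueError(f"transition {name!r} is not enabled")
        inputs, outputs = self.transitions[name]
        for p, n in inputs.items():
            self.marking[p] -= n
        for p, n in outputs.items():
            self.marking[p] = self.marking.get(p, 0) + n

# A two-step development process: requirements -> design -> implementation
net = PetriNet({"requirements_done": 1})
net.add_transition("design", {"requirements_done": 1}, {"design_done": 1})
net.add_transition("implement", {"design_done": 1}, {"implementation_done": 1})
net.fire("design")
net.fire("implement")
print(net.marking)
```

The ordering constraint of the process (design before implementation) falls out of the token flow: `implement` is simply not enabled until `design` has fired.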
NASA Technical Reports Server (NTRS)
Singh, S. P.
1979-01-01
The computer software developed to set up a method for Wiener spectrum analysis of photographic films is presented. This method is used for the quantitative analysis of the autoradiographic enhancement process. The software requirements and design for the autoradiographic enhancement process are given along with the program listings and the user's manual. A software description and program listing modifications for the data analysis software are also included.
NASA Astrophysics Data System (ADS)
Kumlander, Deniss
The globalization of companies' operations and the competition between software vendors demand improved quality of delivered software and decreased overall cost. At the same time, globalization introduces many problems into the software development process, as it produces distributed organizations that break the co-location rule of modern software development methodologies. Here we propose a reformulation of the ambassador position, increasing its productivity in order to bridge the communication and workflow gap by managing the entire communication process rather than concentrating purely on the communication result.
Hanne, Janina; Göttfert, Fabian; Schimer, Jiří; Anders-Össwein, Maria; Konvalinka, Jan; Engelhardt, Johann; Müller, Barbara; Hell, Stefan W; Kräusslich, Hans-Georg
2016-09-27
Concomitant with human immunodeficiency virus type 1 (HIV-1) budding from a host cell, cleavage of the structural Gag polyproteins by the viral protease (PR) triggers complete remodeling of virion architecture. This maturation process is essential for virus infectivity. Electron tomography provided structures of immature and mature HIV-1 with a diameter of 120-140 nm, but information about the sequence and dynamics of structural rearrangements is lacking. Here, we employed super-resolution STED (stimulated emission depletion) fluorescence nanoscopy of HIV-1 carrying labeled Gag to visualize the virion architecture. The incomplete Gag lattice of immature virions was clearly distinguishable from the condensed distribution of mature protein subunits. Synchronized activation of PR within purified particles by photocleavage of a caged PR inhibitor enabled time-resolved in situ observation of the induction of proteolysis and maturation by super-resolution microscopy. This study shows the rearrangement of subviral structures in a super-resolution light microscope over time, outwitting phototoxicity and fluorophore bleaching through synchronization of a biological process by an optical switch.
Maeda, Koki; Morioka, Riki; Osada, Takashi
2009-01-01
To control ammonia (NH(3)) volatilization from the dairy cattle (Bos taurus) manure composting process, a compost pile was covered with mature compost and the gas emissions evaluated using the dynamic chamber system. The peak of NH(3) volatilization observed immediately after piling up of the compost was reduced from 196 to 62 mg/m(3) by covering the compost pile with mature compost. The accumulation of NH(4)-N to the covered mature compost was also observed. Covering and mixing the compost with mature compost had no effect on the microbial community structure. However, over time the microbial community structure changed because of a decrease in easily degradable organic compounds in the compost piles. The availability of volatile fatty acids (VFA) was considered to be important for microbial community structure in the compost. After the VFA had disappeared, the NO(3)-N concentration increased and the cellulose degrading bacteria such as Cytophaga increased in number.
ATP Depletion Blocks Herpes Simplex Virus DNA Packaging and Capsid Maturation
Dasgupta, Anindya; Wilson, Duncan W.
1999-01-01
During herpes simplex virus (HSV) assembly, immature procapsids must expel their internal scaffold proteins, transform their outer shell to form mature polyhedrons, and become packaged with the viral double-stranded (ds) DNA genome. A large number of virally encoded proteins are required for successful completion of these events, but their molecular roles are poorly understood. By analogy with the dsDNA bacteriophage we reasoned that HSV DNA packaging might be an ATP-requiring process and tested this hypothesis by adding an ATP depletion cocktail to cells accumulating unpackaged procapsids due to the presence of a temperature-sensitive lesion in the HSV maturational protease UL26. Following return to permissive temperature, HSV capsids were found to be unable to package DNA, suggesting that this process is indeed ATP dependent. Surprisingly, however, the display of epitopes indicative of capsid maturation was also inhibited. We conclude that either formation of these epitopes directly requires ATP or capsid maturation is normally arrested by a proofreading mechanism until DNA packaging has been successfully completed. PMID:9971781
Software process improvement in the NASA software engineering laboratory
NASA Technical Reports Server (NTRS)
Mcgarry, Frank; Pajerski, Rose; Page, Gerald; Waligora, Sharon; Basili, Victor; Zelkowitz, Marvin
1994-01-01
The Software Engineering Laboratory (SEL) was established in 1976 for the purpose of studying and measuring software processes with the intent of identifying improvements that could be applied to the production of ground support software within the Flight Dynamics Division (FDD) at the National Aeronautics and Space Administration (NASA)/Goddard Space Flight Center (GSFC). The SEL has three member organizations: NASA/GSFC, the University of Maryland, and Computer Sciences Corporation (CSC). The concept of process improvement within the SEL focuses on the continual understanding of both process and product as well as goal-driven experimentation and analysis of process change within a production environment.
Literature Review on Systems of Systems (SoS): A Methodology With Preliminary Results
2013-11-01
(Abstract unavailable; only report front matter was extracted.) The report's appendices cover the Enhanced ISAAC Neural Simulation Toolkit (EINSTein) (Appendix H) and the Map Aware Nonuniform Automata (MANA) agent-based model (Appendix I), including a quadrant chart addressing SoS and associated SoSA designs and SoS/SoSA software component maturation scores for the MANA agent-based model.
2013-11-01
big data with R is relatively new. RHadoop is a mature product from Revolution Analytics that uses R with Hadoop Streaming [15] and provides...agnostic all-data summaries or computations, in which case we use MapReduce directly. 2.3 D&R Software Environment In this work, we use the Hadoop ...job scheduling and tracking, data distribution, system architecture, heterogeneity, and fault-tolerance. Hadoop also provides a distributed key-value
Software development environments: Status and trends
NASA Technical Reports Server (NTRS)
Duffel, Larry E.
1988-01-01
Currently software engineers are the essential integrating factors tying several components together. The components consist of process, methods, computers, tools, support environments, and software engineers. The engineers today empower the tools versus the tools empowering the engineers. Some of the issues in software engineering are quality, managing the software engineering process, and productivity. A strategy to accomplish this is to promote the evolution of software engineering from an ad hoc, labor intensive activity to a managed, technology supported discipline. This strategy may be implemented by putting the process under management control, adopting appropriate methods, inserting the technology that provides automated support for the process and methods, collecting automated tools into an integrated environment and educating the personnel.
Lee, Phong D.; Mukherjee, Swati; Edeling, Melissa A.; Dowd, Kimberly A.; Austin, S. Kyle; Manhart, Carolyn J.; Diamond, Michael S.; Fremont, Daved H.
2013-01-01
Flavivirus-infected cells secrete a structurally heterogeneous population of viruses because of an inefficient virion maturation process. Flaviviruses assemble as noninfectious, immature virions composed of trimers of envelope (E) and precursor membrane (prM) protein heterodimers. Cleavage of prM is a required process during virion maturation, although this often remains incomplete for infectious virus particles. Previous work demonstrated that the efficiency of virion maturation could impact antibody neutralization through changes in the accessibility of otherwise cryptic epitopes on the virion. In this study, we show that the neutralization potency of monoclonal antibody (MAb) E33 is sensitive to the maturation state of West Nile virus (WNV), despite its recognition of an accessible epitope, the domain III lateral ridge (DIII-LR). Comprehensive epitope mapping studies with 166 E protein DIII-LR variants revealed that the functional footprint of MAb E33 on the E protein differs subtly from that of the well-characterized DIII-LR MAb E16. Remarkably, aromatic substitutions at E protein residue 306 ablated the maturation state sensitivity of E33 IgG, and the neutralization efficacy of E33 Fab fragments was not affected by changes in the virion maturation state. We propose that E33 IgG binding on mature virions orients the Fc region in a manner that impacts subsequent antibody binding to nearby sites. This Fc-mediated steric constraint is a novel mechanism by which the maturation state of a virion modulates the efficacy of the humoral immune response to flavivirus infection. PMID:24109224
SAGA: A project to automate the management of software production systems
NASA Technical Reports Server (NTRS)
Campbell, Roy H.; Beckman-Davies, C. S.; Benzinger, L.; Beshers, G.; Laliberte, D.; Render, H.; Sum, R.; Smith, W.; Terwilliger, R.
1986-01-01
Research into software development is required to reduce its production cost and to improve its quality. Modern software systems, such as the embedded software required for NASA's space station initiative, stretch current software engineering techniques. The requirements to build large, reliable, and maintainable software systems increases with time. Much theoretical and practical research is in progress to improve software engineering techniques. One such technique is to build a software system or environment which directly supports the software engineering process, i.e., the SAGA project, comprising the research necessary to design and build a software development which automates the software engineering process. Progress under SAGA is described.
Contreras-Padilla, Margarita; Gutiérrez-Cortez, Elsa; Valderrama-Bravo, María Del Carmen; Rojas-Molina, Isela; Espinosa-Arbeláez, Diego Germán; Suárez-Vargas, Raúl; Rodríguez-García, Mario Enrique
2012-03-01
Chemical proximate analysis was done in order to determine the changes in nutritional characteristics of nopal powders from three different maturity stages (50, 100, and 150 days), obtained by three different drying processes: freeze drying, forced air oven, and tunnel. Results indicate that nopal powder obtained by the freeze drying process retains higher contents of protein, soluble fiber, and fat than the other two processes. The freeze drying process also had less effect on the color hue variable. No changes were observed in insoluble fiber content, chroma, or lightness with the three different drying processes. Furthermore, the soluble fibers decreased with the age of the nopal while the insoluble fibers and ash content showed an opposite trend. In addition, the luminosity and hue values did not show differences among the maturity stages studied. The high dietary fiber content of nopal pad powder could make it an interesting source of these important components for human diets, and it could also be used in the food, cosmetics, and pharmaceutical industries.
Compiling software for a hierarchical distributed processing system
Archer, Charles J; Blocksome, Michael A; Ratterman, Joseph D; Smith, Brian E
2013-12-31
Compiling software for a hierarchical distributed processing system including providing to one or more compiling nodes software to be compiled, wherein at least a portion of the software to be compiled is to be executed by one or more nodes; compiling, by the compiling node, the software; maintaining, by the compiling node, any compiled software to be executed on the compiling node; selecting, by the compiling node, one or more nodes in a next tier of the hierarchy of the distributed processing system in dependence upon whether any compiled software is for the selected node or the selected node's descendents; sending to the selected node only the compiled software to be executed by the selected node or selected node's descendent.
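The selection step described above, in which a compiling node forwards to each next-tier node only the software destined for that node or its descendants, can be sketched as follows (a hypothetical simplification; the tree layout, artifact names, and function names are all illustrative):

```python
def descendants(tree, node):
    """All nodes in the subtree rooted at `node` (excluding `node` itself)."""
    out = []
    for child in tree.get(node, []):
        out.append(child)
        out.extend(descendants(tree, child))
    return out

def distribute(tree, node, artifacts):
    """Split compiled artifacts into those kept on the compiling node and the
    slice sent to each next-tier node (only software for that node's subtree)."""
    keep = {t: a for t, a in artifacts.items() if t == node}
    sends = {}
    for child in tree.get(node, []):
        subtree = {child, *descendants(tree, child)}
        sends[child] = {t: a for t, a in artifacts.items() if t in subtree}
    return keep, sends

# Hierarchy: root has children "a" and "b"; "a" has child "a1".
tree = {"root": ["a", "b"], "a": ["a1"], "b": []}
artifacts = {"root": "bin0", "a": "bin1", "a1": "bin2", "b": "bin3"}
keep, sends = distribute(tree, "root", artifacts)
print(keep)   # artifact executed on the compiling node itself
print(sends)  # only the software needed in each child's subtree
```

Node "b" never sees the binaries for "a" or "a1", which is the point of the per-subtree selection: each tier receives only what it or its descendants will execute.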
Software Defined Radio with Parallelized Software Architecture
NASA Technical Reports Server (NTRS)
Heckler, Greg
2013-01-01
This software implements software-defined radio processing over multicore, multi-CPU systems in a way that maximizes the use of CPU resources in the system. The software treats each processing step in either a communications or navigation modulator or demodulator system as an independent, threaded block. Each threaded block is defined with a programmable number of input or output buffers; these buffers are implemented using POSIX pipes. In addition, each threaded block is assigned a unique thread upon block installation. A modulator or demodulator system is built by assembly of the threaded blocks into a flow graph, which assembles the processing blocks to accomplish the desired signal processing. This software architecture allows the software to scale effortlessly between single CPU/single-core computers or multi-CPU/multi-core computers without recompilation. NASA spaceflight and ground communications systems currently rely exclusively on ASICs or FPGAs. This software allows low- and medium-bandwidth (100 bps to approx. 50 Mbps) software defined radios to be designed and implemented solely in C/C++ software, while lowering development costs and facilitating reuse and extensibility.
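The flow-graph architecture the abstract describes, with independent threaded blocks connected by pipes, can be sketched as follows (shown in Python for brevity, though the abstract's radios are written in C/C++; the block names and the trivial gain stage are illustrative, not part of the actual system):

```python
import os
import threading

def source(w):
    """Producer block: write samples into its output pipe."""
    with os.fdopen(w, "w") as out:
        for i in range(5):
            out.write(f"{i}\n")

def scaler(r, w, gain):
    """Processing block: read samples, apply a gain, write downstream."""
    with os.fdopen(r) as inp, os.fdopen(w, "w") as out:
        for line in inp:
            out.write(f"{int(line) * gain}\n")

# Flow graph: source -> scaler -> main thread (the "sink").
# Each block gets its own thread; pipes are the inter-block buffers.
r1, w1 = os.pipe()
r2, w2 = os.pipe()
threads = [threading.Thread(target=source, args=(w1,)),
           threading.Thread(target=scaler, args=(r1, w2, 10))]
for t in threads:
    t.start()
with os.fdopen(r2) as sink:
    results = [int(line) for line in sink]
for t in threads:
    t.join()
print(results)
```

Because every block blocks on its own pipe reads and writes, the operating system's scheduler spreads the stages across however many cores are available, which is the scaling property the abstract claims for the real system.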
Harnessing ISO/IEC 12207 to Examine the Extent of SPI Activity in an Organisation
NASA Astrophysics Data System (ADS)
Clarke, Paul; O'Connor, Rory
The quality of the software development process directly affects the quality of the software product. To be successful, software development organisations must respond to changes in technology and business circumstances, and therefore software process improvement (SPI) is required. SPI activity relates to any modification that is performed to the software process in order to improve an aspect of the process. Although multiple process assessments could be employed to examine SPI activity, they present an inefficient tool for such an examination. This paper presents an overview of a new survey-based resource that utilises the process reference model in ISO/IEC 12207 in order to expressly and directly determine the level of SPI activity in a software development organisation. This survey instrument can be used by practitioners, auditors and researchers who are interested in determining the extent of SPI activity in an organisation.
Adolescent Judgement As Evidenced In Response To Poetry
ERIC Educational Resources Information Center
Mason, J. S.
1974-01-01
In this article, the nature of adolescent judgement was investigated by means of two components: the maturation of adolescent mental processes and the interaction of poetry in that maturation. (Author/RK)
AVE-SESAME program for the REEDA System
NASA Technical Reports Server (NTRS)
Hickey, J. S.
1981-01-01
The REEDA system software was modified and improved to process the AVE/SESAME severe storm data. A random-access file system for the AVE storm data was designed, tested, and implemented. The AVE/SESAME software was modified to read the random-access files and to interface with the new graphics hardware/software now available on the REEDA system. Software was developed to display the AVE/SESAME data graphically in the conventions normally used by severe storm researchers. Existing software was converted to the AVE/SESAME software systems and interfaced with the graphics hardware/software available on the REEDA system. Documentation was provided for the existing AVE/SESAME programs, including functional flow charts and interactive prompts. All AVE/SESAME data sets were converted to random-access format so that the developed software could access the entire AVE/SESAME database. The existing software was also modified to process different AVE/SESAME data set types, including satellite, surface, and radar data.
Software Quality Assurance Metrics
NASA Technical Reports Server (NTRS)
McRae, Kalindra A.
2004-01-01
Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards, and procedures. In software development, software quality means meeting requirements with a degree of excellence and refinement in a project or product. Software quality is a set of attributes by which a software product's quality is described and evaluated; the set includes functionality, reliability, usability, efficiency, maintainability, and portability. Software metrics help us understand the technical process used to develop a product: the process is measured to improve it, and the product is measured to increase quality throughout the software life cycle. Software metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce it, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of software development can be measured. If software metrics are implemented in software development, they can save time and money and allow the organization to identify the causes of defects that have the greatest effect on software development. During the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA metrics that have been used on other projects but are not currently used by the SA team, and to report them to the Software Assurance team to see whether any could be implemented in their software assurance life cycle process.
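As a concrete illustration of the kind of metric involved, the sketch below computes defect density (defects per thousand source lines of code), one widely used SQA metric. The component names and numbers are hypothetical:

```python
def defect_density(defects_found, ksloc):
    """Defect density: defects per thousand source lines of code (KSLOC),
    a common SQA metric for comparing releases or components."""
    if ksloc <= 0:
        raise ValueError("KSLOC must be positive")
    return defects_found / ksloc

# Hypothetical per-component data for a release: (defects found, KSLOC).
components = {"telemetry": (12, 8.0), "display": (3, 4.0)}
for name, (defects, ksloc) in sorted(components.items()):
    print(f"{name}: {defect_density(defects, ksloc):.2f} defects/KSLOC")
# display: 0.75 defects/KSLOC
# telemetry: 1.50 defects/KSLOC
```

Tracking such a number across releases gives the baseline for estimation and the defect-cause visibility the abstract mentions.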
USDA-ARS?s Scientific Manuscript database
Two of the important cotton fiber quality and processing parameters are fiber maturity and fineness. Fiber maturity is the degree of development of the fiber’s secondary wall, and fiber fineness is a measure of the fiber’s linear density and can be expressed as mass per unit length. A well-known m...
Calvin E. Meier; John A. Stanturf; Emile S. Gardiner; Paul B. Hamel; Melvin L. Warren
1999-01-01
We report our efforts, initiated in 1995, to quantify ecological processes and functions in a relatively undisturbed, mature hardwood forest. The 320-ha site is located in central Louisiana on the upper reaches of Iatt Creek, an anastomosing minor stream bottom. The forest is a mature sweetgum (Liquidambar styraciflua L.)-cherrybark oak (
Software Development and Test Methodology for a Distributed Ground System
NASA Technical Reports Server (NTRS)
Ritter, George; Guillebeau, Pat; McNair, Ann R. (Technical Monitor)
2002-01-01
The Marshall Space Flight Center's (MSFC) Payload Operations Center (POC) ground system has evolved over a period of about 10 years. During this time the software processes have migrated from more traditional to more contemporary development processes in an effort to minimize unnecessary overhead while maximizing process benefits. The software processes that have evolved still emphasize requirements capture, software configuration management, design documentation, and making sure the products that have been developed are accountable to the initial requirements. This paper gives an overview of how the software processes have evolved, highlighting the positives as well as the negatives. In addition, we discuss the COTS tools that have been integrated into the processes and how they have provided value to the project.
Unexpected trend in the compositional maturity of second-cycle sand
Solano-Acosta, W.; Dutta, P.K.
2005-01-01
It is generally accepted that recycling of sandstone generates relatively more mature sand than its parent sandstone. Such maturity is accomplished mainly through chemical weathering as the chemically unstable minerals are eliminated. Because chemical weathering is ubiquitous on the Earth's surface, maturity due to recycling is expected in most geological settings. However, contrary to one's expectation, second-cycle Holocene sand, exclusively derived from sandy facies of the first-cycle Pennsylvanian-Permian Cutler Formation, is actually less mature than its first-cycle parent near Gateway, Colorado. Both the Cutler sandstone and Holocene sand were the products of similar geological processes that controlled their respective composition. In spite of such similarities, a significant difference in composition is observed. We propose that the unexpected immaturity in second-cycle Holocene sand may be due to mechanical disintegration of coarse-grained feldspar and feldspar-rich rock fragments into relatively smaller fractions. Results presented in this paper are the first quantitative estimation of recycling of parent sandstone into daughter sand, and the first observed reverse maturity trend in second-cycle sand. These unexpected results suggest the need for further research to quantitatively understand the recycling process. © 2005 Elsevier B.V. All rights reserved.
Improvements to Autoplot's HAPI Support
NASA Astrophysics Data System (ADS)
Faden, J.; Vandegriff, J. D.; Weigel, R. S.
2017-12-01
Autoplot handles data from a variety of data servers. These servers communicate data in different forms, each somewhat different in capabilities and each needing new software to interface. The Heliophysics Application Programmer's Interface (HAPI) attempts to ease this by providing a standard target for clients and servers to meet. Autoplot fully supports reading data from HAPI servers, and support continues to improve as the HAPI server spec matures. This collaboration has already produced robust clients and documentation which would be expensive for groups creating their own protocol. For example, client-side data caching is introduced where Autoplot maintains a cache of data for performance and off-line use. This is a feature we considered for previous data systems, but we could never afford the time to study and implement this carefully. Also, Autoplot itself can be used as a server, making the data it can read and the results of its processing available to other data systems. Autoplot use with other data transmission systems is reviewed as well, outlining features of each system.
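As a rough sketch of what talking to a HAPI server involves, the following builds a HAPI `data` request URL from a dataset id and an ISO 8601 time range, as the HAPI specification defines; the server address and dataset id shown are hypothetical, and no network request is made:

```python
from urllib.parse import urlencode

def hapi_data_url(server, dataset, time_min, time_max, parameters=None):
    """Build a HAPI 'data' request URL. Per the HAPI spec, data are
    requested by dataset id and an ISO 8601 time range; CSV is the
    default response format."""
    query = {"id": dataset, "time.min": time_min, "time.max": time_max}
    if parameters:
        # Optional subset of the dataset's parameters, comma-separated.
        query["parameters"] = ",".join(parameters)
    return f"{server.rstrip('/')}/data?{urlencode(query)}"

# Hypothetical server and dataset id, for illustration only.
url = hapi_data_url("https://example.org/hapi", "AC_H0_MFI",
                    "2017-01-01T00:00:00Z", "2017-01-02T00:00:00Z")
print(url)
```

Because every compliant server answers the same request shape, a client like Autoplot (and its data cache) needs to be written once, which is the cost saving the abstract describes.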
Towards Measurement of Confidence in Safety Cases
NASA Technical Reports Server (NTRS)
Denney, Ewen; Pai, Ganesh J.; Habli, Ibrahim
2011-01-01
Arguments in safety cases are predominantly qualitative. This is partly attributed to the lack of sufficient design and operational data necessary to measure the achievement of high-dependability targets, particularly for safety-critical functions implemented in software. The subjective nature of many forms of evidence, such as expert judgment and process maturity, also contributes to the overwhelming dependence on qualitative arguments. However, where data for quantitative measurements are systematically collected, quantitative arguments offer far greater benefits than qualitative arguments in assessing confidence in the safety case. In this paper, we propose a basis for developing and evaluating integrated qualitative and quantitative safety arguments based on the Goal Structuring Notation (GSN) and Bayesian Networks (BNs). The approach we propose identifies structures within GSN-based arguments where uncertainties can be quantified. BNs are then used to provide a means to reason about confidence in a probabilistic way. We illustrate our approach using a fragment of a safety case for an unmanned aerial system and conclude with some preliminary observations.
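The quantification idea can be reduced to its smallest case: a single claim node whose belief is updated by one piece of evidence via Bayes' rule, the two-node degenerate form of the GSN-to-BN mapping the paper describes. The numbers below are illustrative only and are not taken from the paper:

```python
def posterior_confidence(prior, p_ev_given_true, p_ev_given_false):
    """Posterior belief that a safety claim holds after observing
    supporting evidence, via Bayes' rule: the minimal two-node
    Bayesian network over (claim, evidence)."""
    num = p_ev_given_true * prior
    den = num + p_ev_given_false * (1.0 - prior)
    return num / den

# Illustrative numbers: a 0.7 prior belief in the claim, and evidence
# (say, a passed verification activity) much likelier if the claim is
# true (0.9) than if it is false (0.2).
print(round(posterior_confidence(0.7, 0.9, 0.2), 3))  # 0.913
```

In a full safety case the network has one node per GSN goal and evidence item, and belief propagates through the argument structure rather than through a single update.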
F-22 cockpit avionics: a systems integration success story
NASA Astrophysics Data System (ADS)
Greeley, Kevin W.; Schwartz, Richard J.
2000-08-01
The F-22 'Raptor' is being developed and manufactured as multi-role fighter aircraft for the 'air dominance' mission. The F-22 team is led by Lockheed Martin, with Boeing and Pratt & Whitney as partners. The F-22 weapons system combines supersonic cruise, maneuverability, stealth, and an extensive suite of tightly integrated sensors to achieve a high level of lethality and invulnerability against current and projected threat systems such as fighter aircraft and surface to air missiles. Despite high automation of the complex systems installed in the F-22, the pilot is heavily tasked for air battle management. Response timelines are compressed due to supersonic cruise velocities. These factors challenge the Pilot Vehicle Interface (PVI) design. This paper discusses the team's response to these challenges, describing the physical cockpit layout, its controls and displays, and the hardware architecture, software tools, and development process used to mature the F-22 'Raptor' weapons system, including a review of Human Factors design considerations for F-22 displays.
Visual Navigation - SARE Mission
NASA Technical Reports Server (NTRS)
Alonso, Roberto; Kuba, Jose; Caruso, Daniel
2007-01-01
The SARE Earth Observing and Technological Mission is part of the Argentinean Space Agency's (CONAE - Comision Nacional de Actividades Espaciales) Small and Technological Payloads Program. The Argentinean National Space Program requires the SARE mission to test, in a real environment, several units, assemblies, and components in order to reduce the risk of using this equipment in more expensive space missions. The objective is to use components with acceptable maturity in design or development, but without any space heritage. From the application point of view, this mission offers new products in the Earth observation data market, which are listed in the present paper. One of the technological payloads on board the SARE satellite is the Ground Tracker sensor. It computes the satellite attitude and orbit in real time (goal) and/or by ground processing. The first operating mode requires a dedicated computer and mass memory as part of the sensor; for the second operational mode the hardware and software are much simpler.
Using formal methods for content validation of medical procedure documents.
Cota, Érika; Ribeiro, Leila; Bezerra, Jonas Santos; Costa, Andrei; da Silva, Rosiana Estefane; Cota, Gláucia
2017-08-01
We propose the use of a formal approach to support content validation of a standard operating procedure (SOP) for a therapeutic intervention. Such an approach provides a useful tool to identify ambiguities, omissions, and inconsistencies, and improves the applicability and efficacy of documents in health settings. We apply and evaluate a methodology originally proposed for the verification of software specification documents to a specific SOP. The verification methodology uses the graph formalism to model the document. Semi-automatic analysis identifies possible problems in the model and in the original document. The verification is an iterative process that identifies possible faults in the original text, to be revised by its authors and/or specialists. The proposed method identified 23 possible issues in the original document (ambiguities, omissions, redundant information, and inaccuracies, among others). The formal verification process helped the specialists consider a wider range of usage scenarios and identify which instructions form the kernel of the proposed SOP and which represent additional or required knowledge that is mandatory for the correct application of the medical document. By using the proposed verification process, a simpler yet more complete SOP could be produced. As a consequence, during the validation process the experts received a more mature document and could focus on the technical aspects of the procedure itself. Copyright © 2017 Elsevier B.V. All rights reserved.
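One simple instance of the kind of semi-automatic check a graph model of a document enables is reachability analysis: a step that is defined in the SOP but unreachable from the entry point signals a possible omission or dangling instruction. The SOP fragment below is hypothetical, and this is only a sketch of the general idea, not the authors' tooling:

```python
def unreachable_steps(graph, start):
    """Given an SOP modeled as a directed graph (step -> next steps),
    return the steps never reachable from the entry point -- one
    automated check for omissions in the document's control flow."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        if node in seen:
            continue
        seen.add(node)
        stack.extend(graph.get(node, []))
    return sorted(set(graph) - seen)

# Hypothetical SOP fragment: "confirm_dose" is written into the
# document but no other step ever leads to it.
sop = {"assess": ["prescribe"], "prescribe": ["administer"],
       "administer": [], "confirm_dose": ["administer"]}
print(unreachable_steps(sop, "assess"))  # ['confirm_dose']
```

Ambiguity and redundancy checks can be phrased as other graph queries (multiple outgoing edges with overlapping conditions, duplicated subpaths) over the same model.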
NASA Technical Reports Server (NTRS)
Devolites, Jennifer L.; Olansen, Jon B.
2015-01-01
NASA's Morpheus Project has developed and tested a prototype planetary lander capable of vertical takeoff and landing that is designed to serve as a testbed for advanced spacecraft technologies. The lander vehicle, propelled by a liquid oxygen (LOX)/methane engine and sized to carry a 500 kg payload to the lunar surface, provides a platform for bringing technologies from the laboratory into an integrated flight system at relatively low cost. In 2012, Morpheus began integrating the Autonomous Landing and Hazard Avoidance Technology (ALHAT) sensors and software onto the vehicle in order to demonstrate safe, autonomous landing and hazard avoidance. From the beginning, one of the goals of the Morpheus Project was to streamline agency processes and practices. The Morpheus Project accepted the challenge to tailor the traditional NASA systems engineering approach in a way that would be appropriate for a lower-cost, rapid-prototype engineering effort, but retain the essence of the guiding principles. This paper describes the tailored project life cycle and systems engineering approach for the Morpheus Project, including the processes, tools, and amount of rigor employed over the project's multiple life cycles since the project began in fiscal year (FY) 2011.
Spacecraft On-Board Information Extraction Computer (SOBIEC)
NASA Technical Reports Server (NTRS)
Eisenman, David; Decaro, Robert E.; Jurasek, David W.
1994-01-01
The Jet Propulsion Laboratory is the technical monitor on an SBIR program issued to Irvine Sensors Corporation to develop a highly compact, dual-use massively parallel processing node known as SOBIEC. SOBIEC couples 3D memory-stacking technology with processor technology provided by nCUBE. The node contains sufficient network input/output to implement up to an order-13 binary hypercube. The benefit of this network is that it scales linearly as more processors are added, and it is a superset of other commonly used interconnect topologies such as meshes, rings, toroids, and trees. In this manner, a distributed processing network can be easily devised and supported. The SOBIEC node has sufficient memory for most multi-computer applications, and also supports external memory expansion and DMA interfaces. The SOBIEC node is supported by a mature set of software development tools from nCUBE. The nCUBE operating system (OS) provides configuration and operational support for up to 8,000 SOBIEC processors in an order-13 binary hypercube or any subset or partition(s) thereof. The OS is UNIX (USL SVR4) compatible, with C, C++, and FORTRAN compilers readily available. A stand-alone development system is also available to support SOBIEC test and integration.
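The hypercube addressing behind this scaling is compact: in an order-n binary hypercube, each node's neighbors are found by flipping one of the n bits of its address, so link count grows with n while node count grows as 2^n. A minimal sketch of the neighbor computation (not nCUBE's actual routing code):

```python
def hypercube_neighbors(node, order):
    """Neighbors of a node in an order-n binary hypercube: flip each of
    the n address bits in turn. Meshes, rings, and trees embed as
    subsets of these links, which is why the topology is a superset
    of those interconnects."""
    return [node ^ (1 << bit) for bit in range(order)]

# An order-13 hypercube has 2**13 = 8192 nodes, each with 13 neighbors,
# consistent with the "up to 8,000 processors" figure above.
print(len(hypercube_neighbors(0, 13)))  # 13
print(hypercube_neighbors(0b101, 3))    # [4, 7, 1]
```

Routing between any two nodes takes at most n hops (one per differing address bit), which keeps communication latency logarithmic in machine size.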
Software engineering processes for Class D missions
NASA Astrophysics Data System (ADS)
Killough, Ronnie; Rose, Debi
2013-09-01
Software engineering processes are often seen as anathemas; thoughts of CMMI key process areas and NPR 7150.2A compliance matrices can motivate a software developer to consider other career fields. However, with adequate definition, common-sense application, and an appropriate level of built-in flexibility, software engineering processes provide a critical framework in which to conduct a successful software development project. One problem is that current models seem to be built around an underlying assumption of "bigness," and assume that all elements of the process are applicable to all software projects regardless of size and tolerance for risk. This is best illustrated in NASA's NPR 7150.2A in which, aside from some special provisions for manned missions, the software processes are to be applied based solely on the criticality of the software to the mission, completely agnostic of the mission class itself. That is, the processes applicable to a Class A mission (high priority, very low risk tolerance, very high national significance) are precisely the same as those applicable to a Class D mission (low priority, high risk tolerance, low national significance). This paper will propose changes to NPR 7150.2A, taking mission class into consideration, and discuss how some of these changes are being piloted for a current Class D mission—the Cyclone Global Navigation Satellite System (CYGNSS).
Caring for a Seriously Ill Child
... medical situation, but also your child's age and maturity level. It's important to know, if possible, what ... process when possible. Depending on their ages and maturity level, visiting the hospital, meeting the nursing and ...
Proceedings of the Fifteenth Annual Software Engineering Workshop
NASA Technical Reports Server (NTRS)
1990-01-01
The Software Engineering Laboratory (SEL) is an organization sponsored by GSFC and created for the purpose of investigating the effectiveness of software engineering technologies when applied to the development of applications software. The goals of the SEL are: (1) to understand the software development process in the GSFC environment; (2) to measure the effect of various methodologies, tools, and models on this process; and (3) to identify and then to apply successful development practices. Fifteen papers were presented at the Fifteenth Annual Software Engineering Workshop in five sessions: (1) SEL at age fifteen; (2) process improvement; (3) measurement; (4) reuse; and (5) process assessment. The sessions were followed by two panel discussions: (1) experiences in implementing an effective measurement program; and (2) software engineering in the 1980's. A summary of the presentations and panel discussions is given.
Malheiro, R; Casal, S; Pinheiro, L; Baptista, P; Pereira, J A
2018-02-21
The olive fly, Bactrocera oleae (Rossi) (Diptera: Tephritidae), is a key pest in the main olive-producing areas worldwide, and displays distinct preferences for different olive cultivars. The present work studied oviposition preference towards three Portuguese cultivars (Cobrançosa, Madural, and Verdeal Transmontana) at different maturation indexes. Multiple oviposition bioassays (multiple-choice and no-choice) were conducted to assess cultivar preference. No-choice bioassays were conducted to assess the influence of different maturation indexes (MI 2, MI 3, and MI 4) in single cultivars. The longevity of olive fly adults according to the cultivar in which the larvae developed was also evaluated through survival assays. Cultivar and maturation are crucial aspects of olive fly preference. Field and laboratory assays revealed a preference for cv. Verdeal Transmontana olives and a lower susceptibility of cv. Cobrançosa olives. A higher preference was observed for olives at MI 2 and MI 3. The slower maturation process in cv. Verdeal Transmontana (still green while the other cultivars are reddish or at the black stage) seems to have an attractive effect on olive fly females, thus increasing its infestation levels. Olive fly adults of both sexes live longer if they emerged from pupae developed in cv. Verdeal Transmontana fruits, and less long if they emerged from cv. Cobrançosa. Therefore, olive cultivar and maturation are crucial aspects of olive fly preference, also influencing the longevity of adults.
Longitudinal Growth Curves of Brain Function Underlying Inhibitory Control through Adolescence
Foran, William; Velanova, Katerina; Luna, Beatriz
2013-01-01
Neuroimaging studies suggest that developmental improvements in inhibitory control are primarily supported by changes in prefrontal executive function. However, studies are contradictory with respect to how activation in prefrontal regions changes with age, and they have yet to analyze longitudinal data using growth curve modeling, which allows characterization of dynamic processes of developmental change, individual differences in growth trajectories, and variables that predict any interindividual variability in trajectories. In this study, we present growth curves modeled from longitudinal fMRI data collected over 302 visits (across ages 9 to 26 years) from 123 human participants. Brain regions within circuits known to support motor response control, executive control, and error processing (i.e., aspects of inhibitory control) were investigated. Findings revealed distinct developmental trajectories for regions within each circuit and indicated that a hierarchical pattern of maturation of brain activation supports the gradual emergence of adult-like inhibitory control. Mean growth curves of activation in motor response control regions revealed no changes with age, although interindividual variability decreased with development, indicating equifinality with maturity. Activation in certain executive control regions decreased with age until adolescence, and variability was stable across development. Error-processing activation in the dorsal anterior cingulate cortex showed continued increases into adulthood and no significant interindividual variability across development, and was uniquely associated with task performance. 
These findings provide evidence that continued maturation of error-processing abilities supports the protracted development of inhibitory control over adolescence, while motor response control regions provide early-maturing foundational capacities and suggest that some executive control regions may buttress immature networks as error processing continues to mature. PMID:24227721
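The curve-fitting step at the heart of growth curve modeling can be sketched on synthetic numbers: model activation as a quadratic function of age and read the developmental peak off the fitted vertex. The study itself fits mixed-effects growth curve models to longitudinal fMRI data; the sketch below shows only the core least-squares idea, on made-up values:

```python
import numpy as np

# Synthetic "activation vs. age" data following an inverted-U shape
# peaking at age 15 (hypothetical numbers, not the study's data).
ages = np.array([9, 12, 15, 18, 21, 24, 26], dtype=float)
activation = 2.0 - 0.008 * (ages - 15) ** 2

# Fit a quadratic growth curve a*age^2 + b*age + c by least squares.
a, b, c = np.polyfit(ages, activation, deg=2)

# The vertex of the fitted parabola estimates the age of peak activation.
peak_age = -b / (2 * a)
print(round(peak_age, 1))  # 15.0
```

Mixed-effects versions of this fit add per-participant random intercepts and slopes, which is what lets the study characterize interindividual variability in trajectories.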
Software Design Methodology Migration for a Distributed Ground System
NASA Technical Reports Server (NTRS)
Ritter, George; McNair, Ann R. (Technical Monitor)
2002-01-01
The Marshall Space Flight Center's (MSFC) Payload Operations Center (POC) ground system has been developed and has evolved over a period of about 10 years. During this time the software processes have migrated from more traditional to more contemporary development processes. The new software processes still emphasize requirements capture, software configuration management, design documentation, and making sure the products that have been developed are accountable to the initial requirements. This paper gives an overview of how the software processes have evolved, highlighting the positives as well as the negatives. In addition, we mention the COTS tools that have been integrated into the processes and how they have provided value to the project.
Imaging Chromosome Separation in Mouse Oocytes by Responsive 3D Confocal Timelapse Microscopy.
Lane, Simon I R; Crouch, Stephen; Jones, Keith T
2017-01-01
Accurate chromosome segregation is necessary so that genetic material is equally shared among daughter cells. However, maturing mammalian oocytes are particularly prone to chromosome segregation errors, making them a valuable tool for identifying the causes of mis-segregation. Factors such as aging, cohesion loss, DNA damage, and the roles of a plethora of kinetochore and cell cycle-related proteins are involved. To study chromosome segregation in oocytes in a live setting is an imaging challenge that requires advanced techniques. Here we describe a method for examining chromosomes in live oocytes in detail as they undergo maturation. Our method is based on tracking the "center of brightness" of fluorescently labeled chromosomes. Here we describe how to set up our software and run experiments on a Leica TCS SP8 confocal microscope, but the method would be transferable to other microscopes with computer-aided microscopy.
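The "center of brightness" quantity the method tracks is an intensity-weighted centroid: each pixel's coordinate is weighted by its brightness, analogous to a center of mass. A minimal 2D sketch follows; the published pipeline operates on 3D confocal stacks and drives the microscope responsively, which is not reproduced here:

```python
import numpy as np

def center_of_brightness(img):
    """Intensity-weighted centroid (row, col) of a 2D image: the
    'center of brightness' of fluorescently labeled chromosomes."""
    total = img.sum()
    rows, cols = np.indices(img.shape)
    return (rows * img).sum() / total, (cols * img).sum() / total

img = np.zeros((5, 5))
img[1, 2] = 1.0   # a dim labeled "chromosome" pixel
img[3, 2] = 3.0   # a brighter one, pulling the centroid toward row 3
r, c = center_of_brightness(img)
print((float(r), float(c)))  # (2.5, 2.0)
```

Re-centering the imaging volume on this point at each timepoint is what keeps the moving chromosomes inside the acquired stack during maturation.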
Furtado-Junior, I; Abrunhosa, F A; Holanda, F C A F; Tavares, M C S
2016-06-01
Fishing selectivity of the mangrove crab Ucides cordatus in the north coast of Brazil can be defined as the fisherman's ability to capture and select individuals from a certain size or sex (or a combination of these factors) which suggests an empirical selectivity. Considering this hypothesis, we calculated the selectivity curves for males and females crabs using the logit function of the logistic model in the formulation. The Bayesian inference consisted of obtaining the posterior distribution by applying the Markov chain Monte Carlo (MCMC) method to software R using the OpenBUGS, BRugs, and R2WinBUGS libraries. The estimated results of width average carapace selection for males and females compared with previous studies reporting the average width of the carapace of sexual maturity allow us to confirm the hypothesis that most mature individuals do not suffer from fishing pressure; thus, ensuring their sustainability.
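The logistic selectivity curve named above has a simple closed form: the probability of retention rises sigmoidally with carapace width. The sketch below uses illustrative parameters, not the paper's Bayesian MCMC estimates:

```python
import math

def selectivity(width, w50, slope):
    """Logistic selectivity: probability that a crab of a given carapace
    width is captured/retained. w50 is the width at 50% selection;
    slope controls how sharply selection rises."""
    return 1.0 / (1.0 + math.exp(-slope * (width - w50)))

# Illustrative parameters only: 50% selection at 60 mm carapace width.
print(round(selectivity(60.0, 60.0, 0.3), 2))  # 0.5
print(selectivity(80.0, 60.0, 0.3) > 0.99)     # True
print(selectivity(40.0, 60.0, 0.3) < 0.01)     # True
```

Comparing the estimated w50 against the width at sexual maturity is exactly the test the abstract describes: if w50 lies above the maturity width, most mature crabs escape fishing pressure.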
Byun, Bo-Ram; Kim, Yong-Il; Yamaguchi, Tetsutaro; Maki, Koutaro; Son, Woo-Sung
2015-01-01
This study aimed to examine the correlation between skeletal maturation status and parameters from the odontoid process/body of the second cervical vertebra and the bodies of the third and fourth cervical vertebrae, and simultaneously to build multiple regression models capable of estimating skeletal maturation status in Korean girls. Hand-wrist radiographs and cone beam computed tomography (CBCT) images were obtained from 74 Korean girls (6-18 years of age). CBCT-generated cervical vertebral maturation (CVM) was used to demarcate the odontoid process and the body of the second cervical vertebra, based on the dentocentral synchondrosis. Correlation coefficient analysis and multiple linear regression analysis were used for each parameter of the cervical vertebrae (P < 0.05). Forty-seven of 64 parameters from CBCT-generated CVM (independent variables) exhibited statistically significant correlations (P < 0.05). The multiple regression model with the greatest R^2 had six parameters (PH2/W2, UW2/W2, (OH+AH2)/LW2, UW3/LW3, D3, and H4/W4) as independent variables, with a variance inflation factor (VIF) of <2. CBCT-generated CVM was able to include parameters from the second cervical vertebral body and odontoid process, respectively, in the multiple regression models. This suggests that quantitative analysis might be used to estimate skeletal maturation status.
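The regression machinery involved can be sketched with synthetic stand-ins for the vertebral parameters: fit a multiple linear regression by least squares and report R^2. None of the numbers below come from the study:

```python
import numpy as np

# Synthetic data: 74 subjects, 3 hypothetical vertebral-ratio predictors.
rng = np.random.default_rng(0)
X = rng.uniform(0.5, 2.0, size=(74, 3))
# A made-up "maturation score" driven by two of the predictors plus noise.
y = 1.0 + 2.0 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(0, 0.05, 74)

A = np.column_stack([np.ones(len(X)), X])   # add intercept column
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
resid = y - A @ beta
r2 = 1 - resid.var() / y.var()
print(r2 > 0.9)  # True: the predictors explain most of the variance
```

The study's additional VIF < 2 screen guards against multicollinearity among such ratio predictors before trusting the fitted coefficients.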
Proteomic changes during intestinal cell maturation in vivo
Chang, Jinsook; Chance, Mark R.; Nicholas, Courtney; Ahmed, Naseem; Guilmeau, Sandra; Flandez, Marta; Wang, Donghai; Byun, Do-Sun; Nasser, Shannon; Albanese, Joseph M.; Corner, Georgia A.; Heerdt, Barbara G.; Wilson, Andrew J.; Augenlicht, Leonard H.; Mariadason, John M.
2008-01-01
Intestinal epithelial cells undergo progressive cell maturation as they migrate along the crypt-villus axis. To determine molecular signatures that define this process, proteins differentially expressed between the crypt and villus were identified by 2D-DIGE and MALDI-MS. Forty-six differentially expressed proteins were identified, several of which were validated by immunohistochemistry. Proteins upregulated in the villus were enriched for those involved in brush border assembly and lipid uptake, established features of differentiated intestinal epithelial cells. Multiple proteins involved in glycolysis were also upregulated in the villus, suggesting increased glycolysis is a feature of intestinal cell differentiation. Conversely, proteins involved in nucleotide metabolism, and protein processing and folding were increased in the crypt, consistent with functions associated with cell proliferation. Three novel paneth cell markers, AGR2, HSPA5 and RRBP1 were also identified. Notably, significant correlation was observed between overall proteomic changes and corresponding gene expression changes along the crypt-villus axis, indicating intestinal cell maturation is primarily regulated at the transcriptional level. This proteomic profiling analysis identified several novel proteins and functional processes differentially induced during intestinal cell maturation in vivo. Integration of proteomic, immunohistochemical, and parallel gene expression datasets demonstrate the coordinated manner in which intestinal cell maturation is regulated. PMID:18824147
Fidelity of metal insertion into hydrogenases.
Magalon, A; Blokesch, M; Zehelein, E; Böck, A
2001-06-15
The fidelity of metal incorporation into the active center of hydrogenase 3 from Escherichia coli was studied by analyzing the inhibition of the maturation pathway by zinc and other transition metals. Hydrogenase maturation of wild-type cells was significantly affected only by concentrations of zinc or cadmium higher than 200 microM, whereas a mutant with a lesion in the nickel uptake system displayed a total blockade of the proteolytic processing of the precursor form into the mature form of the large subunit after growth in the presence of 10 microM Zn(2+). The precursor could not be processed in vitro by the maturation endopeptidase even in the presence of an excess of nickel ions. Evidence is presented that zinc does not interfere with the incorporation of iron into the metal center. Precursor of the large subunit accumulated in nickel-proficient cells formed a transient substrate complex with the cognate endoprotease HycI, whereas that of zinc-supplemented cells did not. The results show that zinc can intrude into the nickel-dependent maturation pathway only when nickel uptake is blocked. Under this condition zinc appears to be incorporated at the nickel site of the large subunit and delivers a precursor not amenable to proteolytic processing, since the interaction with the endoprotease is blocked.
Transcriptome and Small RNA Deep Sequencing Reveals Deregulation of miRNA Biogenesis in Human Glioma
Moore, Lynette M.; Kivinen, Virpi; Liu, Yuexin; Annala, Matti; Cogdell, David; Liu, Xiuping; Liu, Chang-Gong; Sawaya, Raymond; Yli-Harja, Olli; Shmulevich, Ilya; Fuller, Gregory N.; Zhang, Wei; Nykter, Matti
2013-01-01
Altered expression of oncogenic and tumor-suppressing microRNAs (miRNAs) is widely associated with tumorigenesis. However, the regulatory mechanisms underlying these alterations are poorly understood. We sought to shed light on the deregulation of miRNA biogenesis promoting the aberrant miRNA expression profiles identified in these tumors. Using sequencing technology to perform both whole-transcriptome and small RNA sequencing of glioma patient samples, we examined precursor and mature miRNAs to directly evaluate the miRNA maturation process, and interrogated expression profiles for genes involved in the major steps of miRNA biogenesis. We found that ratios of mature to precursor forms of a large number of miRNAs increased with the progression from normal brain to low-grade and then to high-grade gliomas. The expression levels of genes involved in each of the three major steps of miRNA biogenesis (nuclear processing, nucleo-cytoplasmic transport, and cytoplasmic processing) were systematically altered in glioma tissues. Survival analysis of an independent data set demonstrated that the alteration of genes involved in miRNA maturation correlates with survival in glioma patients. Direct quantification of miRNA maturation with deep sequencing demonstrated that deregulation of the miRNA biogenesis pathway is a hallmark for glioma genesis and progression. PMID:23007860
Effective Software Engineering Leadership for Development Programs
ERIC Educational Resources Information Center
Cagle West, Marsha
2010-01-01
Software is a critical component of systems ranging from simple consumer appliances to complex health, nuclear, and flight control systems. The development of quality, reliable, and effective software solutions requires the incorporation of effective software engineering processes and leadership. Processes, approaches, and methodologies for…
Process Correlation Analysis Model for Process Improvement Identification
Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong
2014-01-01
Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment identifying strengths and weaknesses, and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used as the base throughout the software process improvement effort. CMMI defines a set of process areas involved in software development and what is to be carried out in those areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort required and the lack of the necessary expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data. PMID:24977170
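The core idea, identifying process elements whose assessment ratings tend to move together, can be sketched as a pairwise correlation over rating vectors. The CMMI practice IDs and the 0-3 ratings below are hypothetical; the paper's actual model is built from CMMI and empirical improvement data, not from this toy computation:

```python
# Hedged sketch: flag correlated process elements from assessment
# ratings. Rows of each list are ratings of one element across projects.
import statistics

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length rating lists."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical ratings (0-3) of three practices across four assessments.
ratings = {
    "REQM SP1.1": [1, 2, 3, 3],
    "PP SP2.1":   [1, 2, 2, 3],
    "CM SP1.2":   [3, 1, 2, 1],
}

names = list(ratings)
for i, a in enumerate(names):
    for b in names[i + 1:]:
        print(f"{a} vs {b}: r = {pearson(ratings[a], ratings[b]):+.2f}")
```

Highly correlated pairs would then be treated together when drafting an improvement plan, which is the inefficiency the model addresses.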
Oviedo-Ocaña, E R; Torres-Lozada, P; Marmolejo-Rebellon, L F; Hoyos, L V; Gonzales, S; Barrena, R; Komilis, D; Sanchez, A
2015-10-01
Stability and maturity are important criteria to guarantee the quality of a compost that is applied to agriculture or used as an amendment in degraded soils. Although different techniques exist to evaluate stability and maturity, the application of laboratory tests in municipalities in developing countries can be limited due to cost and application complexities. In the composting facilities of such places, some classical low-cost on-site tests to monitor the composting process are usually implemented; however, such tests do not necessarily clearly identify conditions of stability and maturity. In this article, we have applied and compared results of stability and maturity tests that can be easily employed on site (i.e. temperature, pH, moisture, electrical conductivity [EC], odor and color), and of tests that require more complex laboratory techniques (volatile solids, C/N ratio, self-heating, respirometric index, germination index [GI]). The evaluation was performed at the field scale using two piles of compost produced from biowaste. The monitoring period was from day 70 to day 190 of the process. Results showed that the low-cost tests traditionally employed to monitor the composting process on-site, such as temperature, color and moisture, do not provide determinations consistent with the more complex laboratory tests used to assess stability (e.g. respiration index, self-heating, volatile solids). In the case of maturity tests (GI, pH, EC), both the on-site tests (pH, EC) and the laboratory test (GI) provided consistent results. Although stability was indicated for most of the samples, the maturity tests indicated that products were consistently immature. Thus, a stable product is not necessarily mature. In conclusion, deciding on the quality of compost in facilities located in developing countries requires the simultaneous use of a combination of tests performed both in the laboratory and on-site. Copyright © 2015 Elsevier Ltd. All rights reserved.
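Among the maturity tests mentioned, the germination index (GI) has a standard closed form. A sketch under the common definition, with illustrative sample values; the roughly 80% maturity threshold is a widely cited rule of thumb, not a result from this study:

```python
# Hedged sketch of the germination index (GI): relative seed germination
# times relative root elongation in compost extract versus a control.

def germination_index(germ_sample, germ_control, root_sample, root_control):
    """GI (%) = (G_sample / G_control) * (L_sample / L_control) * 100."""
    return (germ_sample / germ_control) * (root_sample / root_control) * 100.0

# Illustrative trial: 18 of 20 seeds germinated, mean root length 25 mm
# versus 30 mm in the water control.
gi = germination_index(germ_sample=18, germ_control=20,
                       root_sample=25.0, root_control=30.0)
print(f"GI = {gi:.1f}%")
```

A GI of 75% would fall below the roughly 80% level often taken to indicate a mature, non-phytotoxic compost, matching the article's observation that stable products can still be immature.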
Boccaletto, Pietro; Siddique, Mohammad Abdul Momin; Cosson, Jacky
2018-05-01
Proteomics techniques, such as two-dimensional polyacrylamide gel electrophoresis, mass spectrometry, and differential gel electrophoresis, have been extensively used to describe the protein composition of male gametes in different animals, mainly mammals. They have also provided a deeper understanding of protein functions involved in sperm processes, including processes that in humans lead to male infertility. However, few studies focus on fish sperm proteomics, and even fewer have tried to explore the proteomic profile of sturgeon spermatozoa. Sturgeons are an endangered, ancient group of fish species exploited mostly for caviar. In this fish group, a post-testicular part of the process leads to the final functional maturation of spermatozoa, which thereby gain the capability to activate eggs during fertilization. This process is broadly similar to post-testicular maturation in mammals, where spermatozoa leaving the testes must be mixed with seminal fluid during transit through the Wolffian ducts to modify their surface membrane protein composition, leading to axonemal and acrosomal competence. The aim of this study was to review the current literature on various proteomic techniques and their usefulness in separating, identifying and studying the proteome composition of the fish spermatozoon, as well as their potential applications in studying the post-testicular maturation process in sturgeon. Such understanding could lead to the development of more sophisticated aquaculture techniques favorable for sturgeon reproduction. Copyright © 2018 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Bouty, A. A.; Koniyo, M. H.; Novian, D.
2018-02-01
This study aims to determine the maturity level of information technology governance in the Gorontalo city government by applying the COBIT 4.1 framework. The research method is the case study method, conducting surveys and data collection at 25 institutions in Gorontalo City. The result of this study is an analysis of information technology needs based on the measurement of maturity levels. The measurement of the maturity level of information technology governance shows that many business processes still run at low maturity: of the 9 existing business processes, 4 are at level 2 (Repeatable but Intuitive) and 3 are at level 1 (Initial/Ad Hoc). Given these results, it is expected that the government of Gorontalo city will immediately improve its information technology governance so that it can run more effectively and efficiently.
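A maturity assessment like the one above boils down to rating each process on the COBIT 0-5 scale and summarizing the distribution. The process IDs and ratings below are hypothetical placeholders consistent with the reported counts (4 processes at level 2, 3 at level 1), not the study's actual data:

```python
# Hedged sketch: summarizing COBIT-4.1-style maturity measurements.
from collections import Counter
from statistics import fmean

LEVELS = {0: "Non-existent", 1: "Initial/Ad Hoc",
          2: "Repeatable but Intuitive", 3: "Defined",
          4: "Managed and Measurable", 5: "Optimised"}

# Hypothetical ratings for nine assessed processes (COBIT process IDs).
assessed = {"PO1": 2, "PO4": 1, "AI2": 2, "AI4": 1, "DS5": 2,
            "DS11": 1, "ME1": 2, "PO7": 3, "DS7": 3}

by_level = Counter(assessed.values())
for level, count in sorted(by_level.items()):
    print(f"level {level} ({LEVELS[level]}): {count} process(es)")
print(f"mean maturity: {fmean(assessed.values()):.2f}")
```

The gap between each process's measured level and a target level (often level 3, Defined) is what drives the improvement recommendations.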
NASA Astrophysics Data System (ADS)
Yetman, G.; Downs, R. R.
2011-12-01
Software deployment is needed to process and distribute scientific data throughout the data lifecycle. Developing software in-house can take software development teams away from other software development projects and can require efforts to maintain the software over time. Adopting and reusing software and system modules that have been previously developed by others can reduce in-house software development and maintenance costs and can contribute to the quality of the system being developed. A variety of models are available for reusing and deploying software and systems that have been developed by others. These deployment models include open source software, vendor-supported open source software, commercial software, and combinations of these approaches. Deployment in Earth science data processing and distribution has demonstrated the advantages and drawbacks of each model. Deploying open source software offers advantages for developing and maintaining scientific data processing systems and applications. By joining an open source community that is developing a particular system module or application, a scientific data processing team can contribute to aspects of the software development without having to commit to developing the software alone. Communities of interested developers can share the work while focusing on activities that utilize in-house expertise and address internal requirements. Maintenance is also shared by members of the community. Deploying vendor-supported open source software offers similar advantages to open source software. However, by procuring the services of a vendor, the in-house team can rely on the vendor to provide, install, and maintain the software over time. Vendor-supported open source software may be ideal for teams that recognize the value of an open source software component or application and would like to contribute to the effort, but do not have the time or expertise to contribute extensively. 
Vendor-supported software may also have the additional benefits of guaranteed up-time, bug fixes, and vendor-added enhancements. Deploying commercial software can be advantageous for obtaining system or software components offered by a vendor that meet in-house requirements. The vendor can be contracted to provide installation, support and maintenance services as needed. Combining these options offers a menu of choices, enabling selection of system components or software modules that meet the evolving requirements encountered throughout the scientific data lifecycle.
NASA Astrophysics Data System (ADS)
Musil, Juergen; Schweda, Angelika; Winkler, Dietmar; Biffl, Stefan
Based on our observations of Austrian video game software development (VGSD) practices, we identified a lack of systematic process/method support and inefficient collaboration between the various disciplines involved, i.e. engineers and artists. VGSD includes heterogeneous disciplines, e.g. creative arts, game/content design, and software. Nevertheless, improving team collaboration and process support is an ongoing challenge in enabling a comprehensive view of game development projects. Lessons learned from software engineering practice can help game developers improve game development processes in a heterogeneous environment. Based on a state-of-the-practice survey in the Austrian games industry, this paper (a) presents first results with a focus on process/method support and (b) suggests a candidate flexible process approach based on Scrum to improve VGSD and team collaboration. Results (a) showed a trend toward highly flexible software processes involving various disciplines and (b) identified the suggested flexible process approach as feasible and useful for project application.
Riepsaame, Joey; van Oudenaren, Adri; den Broeder, Berlinda J. H.; van IJcken, Wilfred F. J.; Pothof, Joris; Leenen, Pieter J. M.
2013-01-01
Dendritic cell (DC) maturation is a tightly regulated process that requires coordinated and timed developmental cues. Here we investigate whether microRNAs are involved in this process. We identify microRNAs in mouse GM-CSF-generated, monocyte-related DC (GM-DC) that are differentially expressed during both spontaneous and LPS-induced maturation and characterize M-CSF receptor (M-CSFR), encoded by the Csf1r gene, as a key target for microRNA-mediated regulation in the final step toward mature DC. MicroRNA-22, -34a, and -155 are up-regulated in mature MHCIIhi CD86hi DC and mediate Csf1r mRNA and protein down-regulation. Experimental inhibition of Csf1r-targeting microRNAs in vitro results not only in sustained high level M-CSFR protein expression but also in impaired DC maturation upon stimulation by LPS. Accordingly, over-expression of Csf1r in GM-DC inhibits terminal differentiation. Taken together, these results show that developmentally regulated microRNAs control Csf1r expression, supplementing previously identified mechanisms that regulate its transcription and protein surface expression. Furthermore, our data indicate a novel function for Csf1r in mouse monocyte-derived DC, showing that down-regulation of M-CSFR expression is essential for final DC maturation. PMID:24198819
Studying the Accuracy of Software Process Elicitation: The User Articulated Model
ERIC Educational Resources Information Center
Crabtree, Carlton A.
2010-01-01
Process models are often the basis for demonstrating improvement and compliance in software engineering organizations. A descriptive model is a type of process model describing the human activities in software development that actually occur. The purpose of a descriptive model is to provide a documented baseline for further process improvement…
Liu, Li; Bastien, Nathalie; Li, Yan
2007-12-01
The biosynthesis and posttranslational processing of human metapneumovirus attachment G glycoprotein were investigated. After pulse-labeling, the G protein accumulated as three species with molecular weights of 45,000, 50,000, and 53,000 (45K, 50K, and 53K, respectively). N-Glycosidase digestion indicated that these forms represent the unglycosylated precursor and N-glycosylated intermediate products, respectively. After an appropriate chase, these three naive forms were further processed to a mature 97K form. The presence of O-linked sugars in mature G protein was confirmed by O-glycanase digestion and lectin-binding assay using Arachis hypogaea (peanut agglutinin), an O-glycan-specific lectin. In addition, in the O-glycosylation-deficient cell line (CHO ldlD cell), the G protein could not be processed to the mature form unless the exogenous Gal and GalNAc were supplemented, which provided added evidence supporting the O-linked glycosylation of G protein. The maturation of G was completely blocked by monensin but was partially sensitive to brefeldin A (BFA), suggesting the O-linked glycosylation of G initiated in the trans-Golgi compartment and terminated in the trans-Golgi network. Enzymatic deglycosylation analysis confirmed that the BFA-G was a partial mature form containing N-linked oligosaccharides and various amounts of O-linked carbohydrate side chains. The expression of G protein at the cell surface could be detected by indirect immunofluorescence staining assay. Furthermore, cell surface immunoprecipitation displayed an efficient intracellular transport of G protein.
Lessard, Julie; Pelletier, Mélissa; Biertho, Laurent; Biron, Simon; Marceau, Simon; Hould, Frédéric-Simon; Lebel, Stéfane; Moustarah, Fady; Lescelleur, Odette; Marceau, Picard; Tchernof, André
2015-01-01
Mature adipocytes can reverse their phenotype to become fibroblast-like cells. This is achieved by ceiling culture, and the resulting cells, called dedifferentiated fat (DFAT) cells, are multipotent. Beyond the potential value of these cells for regenerative medicine, the dedifferentiation process itself raises many questions about cellular plasticity and the pathways implicated in cell behavior. This work has been performed with the objective of obtaining new information on adipocyte dedifferentiation, especially pertaining to new targets that may be involved in cellular fate changes. To do so, omental and subcutaneous mature adipocytes sampled from severely obese subjects have been dedifferentiated by ceiling culture. An experimental design with various time points along the dedifferentiation process has been utilized to better understand this process. Cell size, gene and protein expression as well as cytokine secretion were investigated. IL-6, IL-8, SerpinE1 and VEGF secretion were increased during dedifferentiation, whereas MIF-1 secretion was transiently increased. A marked decrease in expression of mature adipocyte transcripts (PPARγ2, C/EBPα, LPL and Adiponectin) was detected early in the process. In addition, some matrix remodeling transcripts (FAP, DPP4, MMP1 and TGFβ1) were rapidly and strongly up-regulated. FAP and DPP4 proteins were simultaneously induced in dedifferentiating mature adipocytes, supporting a potential role for these enzymes in adipose tissue remodeling and cell plasticity. PMID:25816202
USDA-ARS?s Scientific Manuscript database
Mature green banana (Musa sapientum L. cv. Cavendish) fruit were stored in 0.5%, 2 %, or 21% O2 for 7 days at 20 °C before ripening was initiated by ethylene. Residual effects of low O2 storage in mature green fruit on ripening and ester biosynthesis in fruit were investigated during ripening period...
Apical closure of mature molar roots with the use of calcium hydroxide.
Rotstein, I; Friedman, S; Katz, J
1990-11-01
Calcium hydroxide may induce apical root closure in affected mature teeth as well as in immature teeth. Once an apical hard tissue barrier is formed, a permanent root canal filling can be safely condensed. Two cases are described in which calcium hydroxide induced apical root closure in mature molar teeth where the apical constriction was lost because of a chronic inflammatory process.
A Developmental Model of Cross-Cultural Competence at the Tactical Level
2010-11-01
components of 3C and describe how 3C develops in Soldiers. Five components of 3C were identified: Cultural Maturity, Cognitive Flexibility, Cultural... a result of the data analysis: Cultural Maturity, Cognitive Flexibility, Cultural Knowledge, Cultural Acuity, and Interpersonal Skills. These five... create regressions in the 3C development process. In short, KSAAs mature interdependently and simultaneously. Thus, development and transitions across
Rime, Hélène; Nguyen, Thaovi; Ombredane, Kevin; Fostier, Alexis; Bobe, Julien
2015-07-01
In the present study, we aimed at characterizing the effect of cyproterone acetate (CPA), an anti-androgenic compound, on oocyte meiotic maturation in a freshwater teleost fish species, the rainbow trout (Oncorhynchus mykiss). Fully grown post-vitellogenic ovarian follicles were incubated in vitro with CPA, luteinizing hormone (Lh) or a combination of CPA and Lh. Incubations were also performed using a combination of Lh and testosterone (T). The occurrence of oocyte maturation (i.e., resumption of the meiotic process) was assessed by monitoring germinal vesicle breakdown (GVBD) after a 72-h in vitro incubation. The effect of CPA on the production of 17,20β-dihydroxy-4-pregnen-3-one (17,20βP), the natural maturation-inducing steroid (MIS), was quantified by radioimmunoassay. Our results show that CPA dramatically inhibits Lh-induced oocyte maturation and MIS synthesis. We also observed a synergistic effect of Lh and T on oocyte maturation in highly competent oocytes (i.e., those able to resume meiosis after stimulation by low doses of Lh). Our results also show that a combination of CPA and Lh inhibits phosphorylation of the extracellular signal-regulated kinases (Erk), which are associated with oocyte maturation in many species. As a whole, our results indicate that CPA has the potential to alter meiotic maturation in rainbow trout. Further analyses are, however, needed to determine the mechanisms by which this anti-androgen interferes with the meiotic process. Furthermore, the present study provides a framework for better understanding of the ecological consequences of exposure to anti-androgens and the resulting meiotic maturation abnormalities observed in trout. Copyright © 2015 Elsevier B.V. All rights reserved.
Wang, Chunliang; Ritter, Felix; Smedby, Orjan
2010-07-01
To enhance the functional expandability of a picture archiving and communication system (PACS) workstation and to facilitate the integration of third-party image-processing modules, we propose a browser-server style method. In the proposed solution, the PACS workstation shows the front-end user interface defined in an XML file, while the image-processing software runs in the background as a server. Inter-process communication (IPC) techniques allow an efficient exchange of image data, parameters, and user input between the PACS workstation and the stand-alone image-processing software. Using a predefined communication protocol, the PACS workstation developer or image-processing software developer does not need detailed information about the other system, but will still be able to achieve seamless integration between the two systems, and the IPC procedure is totally transparent to the final user. A browser-server style solution was built between OsiriX (PACS workstation software) and MeVisLab (image-processing software). Ten example image-processing modules were easily added to OsiriX by converting existing MeVisLab image-processing networks. Image data transfer using shared memory added <10 ms of processing time, while the other IPC methods cost 1-5 s in our experiments. The browser-server style communication based on IPC techniques is an appealing method that allows PACS workstation developers and image-processing software developers to cooperate while focusing on different interests.
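The speed gap the abstract reports (<10 ms via shared memory versus seconds for other IPC methods) comes from passing pixel data without copying it through a socket. A minimal sketch of that idea using Python's standard shared-memory API; the block-name handshake and the flat 8-bit 512x512 layout are assumptions, not the actual OsiriX/MeVisLab protocol:

```python
# Hedged sketch: one process (the "PACS workstation") writes pixel bytes
# into a shared block; another (the "processing server") attaches and
# reads them in place, with no copy through a socket.
from multiprocessing import shared_memory

WIDTH, HEIGHT = 512, 512
NBYTES = WIDTH * HEIGHT  # one byte per pixel, 8-bit greyscale

# Producer side: create the block and write a flat grey frame into it.
shm = shared_memory.SharedMemory(create=True, size=NBYTES)
shm.buf[:NBYTES] = bytes([128]) * NBYTES

# Consumer side: attach by name (in a real system the name would be sent
# over the IPC control channel) and read pixels in place.
view = shared_memory.SharedMemory(name=shm.name)
pixel = view.buf[0]
print(f"first pixel value: {pixel}")

# Clean up: detach both handles, then free the block.
view.close()
shm.close()
shm.unlink()
```

Only the small control messages (parameters, user input) then need to cross a conventional channel, which is why the per-image overhead stays in the millisecond range.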
Software And Systems Engineering Risk Management
2010-04-01
RSKM; 2004: COSO Enterprise RSKM Framework; 2006: ISO/IEC 16085 Risk Management Process; 2008: ISO/IEC 12207 Software Lifecycle Processes; 2009: ISO/IEC... Software And Systems Engineering Risk Management. John Walz, VP Technical and Conferences Activities, IEEE Computer Society; Vice-Chair Planning..., Software & Systems Engineering Standards Committee, IEEE Computer Society; US TAG to ISO TMB Risk Management Working Group, Systems and Software
Platform-independent software for medical image processing on the Internet
NASA Astrophysics Data System (ADS)
Mancuso, Michael E.; Pathak, Sayan D.; Kim, Yongmin
1997-05-01
We have developed a software tool for image processing over the Internet. The tool is a general purpose, easy to use, flexible, platform independent image processing software package with functions most commonly used in medical image processing. It provides for processing of medical images located either remotely on the Internet or locally. The software was written in Java - the new programming language developed by Sun Microsystems. It was compiled and tested using Microsoft's Visual Java 1.0 and Microsoft's Just in Time Compiler 1.00.6211. The software is simple and easy to use. In order to use the tool, the user needs to download the software from our site before he/she runs it using any Java interpreter, such as those supplied by Sun, Symantec, Borland or Microsoft. Future versions of the operating systems supplied by Sun, Microsoft, Apple, IBM, and others will include Java interpreters. The software is then able to access and process any image on the Internet or on the local computer. Using a 512 X 512 X 8-bit image, a 3 X 3 convolution took 0.88 seconds on an Intel Pentium Pro PC running at 200 MHz with 64 Mbytes of memory. A window/level operation took 0.38 seconds while a 3 X 3 median filter took 0.71 seconds. These performance numbers demonstrate the feasibility of using this software interactively on desktop computers. Our software tool supports various image processing techniques commonly used in medical image processing and can run without the need of any specialized hardware. It can become an easily accessible resource over the Internet to promote the learning and understanding of image processing algorithms. Also, it could facilitate sharing of medical image databases and collaboration amongst researchers and clinicians, regardless of location.
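The 3 X 3 convolution benchmarked above is a standard neighborhood operation. A sketch of the computation on an 8-bit greyscale image stored as a list of rows; the kernel and image are illustrative, and border pixels are left unchanged for brevity (the tool's actual border handling is not described):

```python
# Hedged sketch of a 3x3 convolution on an 8-bit greyscale image,
# clamping results to the 0-255 range. Borders are left unchanged.

def convolve3x3(img, kernel):
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]          # copy; borders stay as-is
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            acc = 0.0
            for ky in (-1, 0, 1):          # walk the 3x3 neighborhood
                for kx in (-1, 0, 1):
                    acc += kernel[ky + 1][kx + 1] * img[y + ky][x + kx]
            out[y][x] = min(255, max(0, int(round(acc))))
    return out

box_blur = [[1 / 9] * 3 for _ in range(3)]  # simple smoothing kernel
image = [[0, 0, 0, 0], [0, 90, 90, 0], [0, 90, 90, 0], [0, 0, 0, 0]]
print(convolve3x3(image, box_blur))
```

Swapping the kernel (e.g., for a Laplacian or sharpening mask) reuses the same loop, which is why a package like this can expose many filters through one code path.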
NASA Astrophysics Data System (ADS)
Yussup, N.; Ibrahim, M. M.; Rahman, N. A. A.; Mokhtar, M.; Salim, N. A. A.; Soh@Shaari, S. C.; Azman, A.; Lombigit, L.; Azman, A.; Omar, S. A.
2018-01-01
Most of the procedures in the neutron activation analysis (NAA) process established at the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s were performed manually. These manual procedures, carried out by the NAA laboratory personnel, are time-consuming and inefficient, especially for the sample counting and measurement process. The sample needs to be changed and the measurement software needs to be set up for every one-hour counting time, and both of these procedures are performed manually for every sample. Hence, an automatic sample changer system (ASC) consisting of hardware and software was developed to automate the sample counting process for up to 30 samples consecutively. This paper describes the ASC control software for the NAA process, which is designed and developed to control the ASC hardware and call the GammaVision software for sample measurement. The software is developed using the National Instruments LabVIEW development package.
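The control flow being automated is a simple batch loop: position the changer, start a fixed-time acquisition, save the spectrum, repeat. A sketch of that loop with placeholder callbacks; the function names and file naming are assumptions, since the real software drives the ASC hardware and GammaVision rather than Python callables:

```python
# Hedged sketch of the ASC batch-counting loop for up to 30 samples.
# Callbacks stand in for hardware and GammaVision control calls.

def run_batch(num_samples, move_to, start_count, save_spectrum,
              count_time_s=3600):  # one-hour counting time per sample
    for position in range(1, num_samples + 1):
        move_to(position)                            # rotate changer to sample
        start_count(count_time_s)                    # begin the measurement
        save_spectrum(f"sample_{position:02d}.spc")  # store the spectrum

# Dry run with logging callbacks instead of hardware.
log = []
run_batch(3,
          move_to=lambda p: log.append(f"move:{p}"),
          start_count=lambda t: log.append(f"count:{t}"),
          save_spectrum=lambda f: log.append(f"save:{f}"),
          count_time_s=1)
print(log)
```

Automating exactly this loop is what removes the hourly manual sample change and software setup the abstract describes.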
Graphical Technique to Support the Teaching/Learning Process of Software Process Reference Models
NASA Astrophysics Data System (ADS)
Espinosa-Curiel, Ismael Edrein; Rodríguez-Jacobo, Josefina; Fernández-Zepeda, José Alberto
In this paper, we propose a set of diagrams to visualize software process reference models (PRM). The diagrams, called dimods, are a combination of visual and process modeling techniques such as rich pictures, mind maps, IDEF and RAD diagrams. We show the use of this technique by designing a set of dimods for the Mexican Software Industry Process Model (MoProSoft). Additionally, we perform an evaluation of the usefulness of dimods. The result of the evaluation shows that dimods may be a support tool that facilitates the understanding, memorization, and learning of software PRMs in both software development organizations and universities. The results also show that dimods may have advantages over the traditional description methods for these types of models.
NASA Technical Reports Server (NTRS)
1973-01-01
A description of each of the software modules of the Image Data Processing System (IDAPS) is presented. The changes in the software modules are the result of additions to the application software of the system and an upgrade of the IBM 7094 Mod(1) computer to a 1301 disk storage configuration. Necessary information about IDAPS software is supplied to the computer programmer who desires to make changes in the software system or who desires to use portions of the software outside of the IDAPS system. Each software module is documented with: module name, purpose, usage, common block(s) description, method (algorithm of subroutine), flow diagram (if needed), subroutines called, and storage requirements.
Software verification plan for GCS. [guidance and control software
NASA Technical Reports Server (NTRS)
Dent, Leslie A.; Shagnea, Anita M.; Hayhurst, Kelly J.
1990-01-01
This verification plan is written as part of an experiment designed to study the fundamental characteristics of the software failure process. The experiment will be conducted using several implementations of software that were produced according to industry-standard guidelines, namely the Radio Technical Commission for Aeronautics RTCA/DO-178A guidelines, Software Considerations in Airborne Systems and Equipment Certification, for the development of flight software. This plan fulfills the DO-178A requirements for providing instructions on the testing of each implementation of software. The plan details the verification activities to be performed at each phase in the development process, contains a step-by-step description of the testing procedures, and discusses all of the tools used throughout the verification process.
TMT approach to observatory software development process
NASA Astrophysics Data System (ADS)
Buur, Hanne; Subramaniam, Annapurni; Gillies, Kim; Dumas, Christophe; Bhatia, Ravinder
2016-07-01
The purpose of the Observatory Software System (OSW) is to integrate all software and hardware components of the Thirty Meter Telescope (TMT) to enable observations and data capture; thus it is a complex software system that is defined by four principal software subsystems: Common Software (CSW), Executive Software (ESW), Data Management System (DMS) and Science Operations Support System (SOSS), all of which have interdependencies with the observatory control systems and data acquisition systems. Therefore, the software development process and plan must consider dependencies to other subsystems, manage architecture, interfaces and design, manage software scope and complexity, and standardize and optimize use of resources and tools. Additionally, the TMT Observatory Software will largely be developed in India through TMT's workshare relationship with the India TMT Coordination Centre (ITCC) and use of Indian software industry vendors, which adds complexity and challenges to the software development process, communication and coordination of activities and priorities as well as measuring performance and managing quality and risk. The software project management challenge for the TMT OSW is thus a multi-faceted technical, managerial, communications and interpersonal relations challenge. 
The approach TMT is using to manage this multifaceted challenge is a combination of establishing an effective geographically distributed software team (Integrated Product Team) with strong project management and technical leadership provided by the TMT Project Office (PO) and the ITCC partner to manage plans, process, performance, risk and quality, and to facilitate effective communications; establishing an effective cross-functional software management team composed of stakeholders, OSW leadership and ITCC leadership to manage dependencies and software release plans, technical complexities and change to approved interfaces, architecture, design and tool set, and to facilitate effective communications; adopting an agile-based software development process across the observatory to enable frequent software releases to help mitigate subsystem interdependencies; defining concise scope and work packages for each of the OSW subsystems to facilitate effective outsourcing of software deliverables to the ITCC partner, and to enable performance monitoring and risk management. At this stage, the architecture and high-level design of the software system has been established and reviewed. During construction each subsystem will have a final design phase with reviews, followed by implementation and testing. The results of the TMT approach to the Observatory Software development process will only be preliminary at the time of the submittal of this paper, but it is anticipated that the early results will be a favorable indication of progress.
CoLiTec software - detection of the near-zero apparent motion
NASA Astrophysics Data System (ADS)
Khlamov, Sergii V.; Savanevych, Vadym E.; Briukhovetskyi, Olexandr B.; Pohorelov, Artem V.
2017-06-01
In this article we describe the CoLiTec software for fully automated frame processing. CoLiTec allows processing of the Big Data of observation results, as well as of data that is continuously formed during observation. The scope of tasks solved includes frame brightness equalization, moving object detection, astrometry, photometry, etc. Along with highly efficient Big Data processing, CoLiTec also ensures high accuracy of data measurements. A comparative analysis of the functional characteristics and positional accuracy was performed between the CoLiTec and Astrometrica software. The benefits of using CoLiTec with wide-field and low-quality frames were observed. The efficiency of the CoLiTec software has been proved by about 700,000 observations and over 1,500 preliminary discoveries.
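Two of the pipeline steps named above, brightness equalization and moving object detection, can be illustrated with a toy frame-differencing sketch. The tiny frames, the median-background equalization, and the fixed threshold are illustrative assumptions, not CoLiTec's actual algorithms:

```python
# Hedged sketch: equalize frame brightness (subtract each frame's median
# background), then flag pixels that change between consecutive frames
# as moving-object candidates.
import statistics

def equalize(frame):
    """Subtract the frame's median value as a crude background level."""
    bg = statistics.median(v for row in frame for v in row)
    return [[v - bg for v in row] for row in frame]

def moving_pixels(frame_a, frame_b, threshold=50):
    """Pixels whose equalized brightness changes by more than threshold."""
    a, b = equalize(frame_a), equalize(frame_b)
    return [(y, x)
            for y, row in enumerate(a)
            for x, v in enumerate(row)
            if abs(b[y][x] - v) > threshold]

f1 = [[10, 10, 10], [10, 200, 10], [10, 10, 10]]  # bright source at (1, 1)
f2 = [[10, 10, 10], [10, 10, 10], [10, 10, 200]]  # source moved to (2, 2)
print(moving_pixels(f1, f2))
```

A real pipeline would then link such candidate detections across many frames into consistent tracks before reporting astrometry and photometry, including the near-zero apparent motions in the title.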
Using Automatic Code Generation in the Attitude Control Flight Software Engineering Process
NASA Technical Reports Server (NTRS)
McComas, David; O'Donnell, James R., Jr.; Andrews, Stephen F.
1999-01-01
This paper presents an overview of the attitude control subsystem flight software development process, identifies how the process has changed due to automatic code generation, analyzes each software development phase in detail, and concludes with a summary of our lessons learned.
RADS Version 4: An Efficient Way to Analyse the Multi-Mission Altimeter Database
NASA Astrophysics Data System (ADS)
Scharroo, Remko; Leuliette, Eric; Naeije, Marc; Martin-Puig, Cristina; Pires, Nelson
2016-08-01
The Radar Altimeter Database System (RADS) has grown to become a mature altimeter database. Over the last 18 years it has been continuously developed, first at Delft University of Technology, and now also at the National Oceanic and Atmospheric Administration (NOAA) and the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT). RADS now serves as a fundamental Climate Data Record for sea level. Because of the multiple users involved in vetting the data and the regular updates to the database, RADS is one of the most accurate and complete databases of satellite altimeter data available. RADS version 4 is a major change from the previous version. While the database is compatible with both software versions, the new software provides new tools, allows easier expansion, and has a better and more standardised interface.
NASA Technical Reports Server (NTRS)
Hancock, David W., III
1999-01-01
This document provides the Software Management Plan for the GLAS Standard Data Software (SDS) supporting the GLAS instrument of the EOS ICESat Spacecraft. The SDS encompasses the ICESat Science Investigator-led Processing System (I-SIPS) Software and the Instrument Support Terminal (IST) Software. For the I-SIPS Software, the SDS will produce Level 0, Level 1, and Level 2 data products as well as the associated product quality assessments and descriptive information. For the IST Software, the SDS will accommodate the GLAS instrument support areas of engineering status, command, performance assessment, and instrument health status.
MOPEX: a software package for astronomical image processing and visualization
NASA Astrophysics Data System (ADS)
Makovoz, David; Roby, Trey; Khan, Iffat; Booth, Hartley
2006-06-01
We present MOPEX - a software package for astronomical image processing and display. The package is a combination of command-line driven image processing software written in C/C++ with a Java-based GUI. The main image processing capabilities include creating mosaic images, image registration, background matching, and point source extraction, as well as a number of minor image processing tasks. The combination of the image processing and display capabilities allows for a much more intuitive and efficient way of performing image processing. The GUI allows the control over image processing and display to be closely intertwined. Parameter setting, validation, and specific processing options are entered by the user through a set of intuitive dialog boxes. Visualization feeds back into further processing by providing prompt feedback on the processing results. The GUI also allows for further analysis by accessing and displaying data from existing image and catalog servers using a virtual observatory approach. Even though the package was originally designed for the Spitzer Space Telescope mission, many of its functionalities are of general usefulness and can be used for working with existing astronomical data and for new missions. The software used in the package has undergone intensive testing and benefited greatly from effective software reuse. The visualization part has been used for observation planning for both the Spitzer and Herschel Space Telescopes as part of the tool Spot. The visualization capabilities of Spot have been enhanced and integrated with the image processing functionality of the command-line driven MOPEX. The image processing software is used in the Spitzer automated pipeline processing, which has been in operation for nearly 3 years. The image processing capabilities have also been tested in off-line processing by numerous astronomers at various institutions around the world. The package is multi-platform and includes automatic update capabilities.
The software package has been developed by a small group of software developers and scientists at the Spitzer Science Center. It is available for distribution at the Spitzer Science Center web page.
The AGU Data Management Maturity Model Initiative
NASA Astrophysics Data System (ADS)
Bates, J. J.
2015-12-01
In September 2014, the AGU Board of Directors approved two initiatives to help the Earth and space sciences community address the growing challenges accompanying the increasing size and complexity of data. These initiatives are: 1) Data Science Credentialing: development of a continuing education and professional certification program to help scientists in their careers and to meet growing responsibilities and requirements around data science; and 2) Data Management Maturity (DMM) Model: development and implementation of a data management maturity model to assess process maturity against best practices, and to identify opportunities in organizational data management processes. Each of these has been organized within AGU as an Editorial Board, and both Boards have held kick-off meetings. The DMM Model Editorial Board will recommend strategies for adapting and deploying a DMM model to the Earth and space sciences, create guidance documents to assist in its implementation, and provide input on a pilot appraisal process. This presentation will provide an overview of progress to date in the DMM Model Editorial Board and plans for work to be done over the upcoming year.
Development of a New VLBI Data Analysis Software
NASA Technical Reports Server (NTRS)
Bolotin, Sergei; Gipson, John M.; MacMillan, Daniel S.
2010-01-01
We present an overview of a new VLBI analysis software under development at NASA GSFC. The new software will replace CALC/SOLVE and many related utility programs. It will have the capabilities of the current system as well as incorporate new models and data analysis techniques. In this paper we give a conceptual overview of the new software and formulate its main goals. The software should be flexible and modular, to implement models and estimation techniques that exist now or will appear in the future. At the same time, it should be reliable and possess production quality for processing standard VLBI sessions. It also needs to be capable of processing observations from a fully deployed network of VLBI2010 stations in a reasonable time. We describe the software development process and outline the software architecture.
Protease-Mediated Maturation of HIV: Inhibitors of Protease and the Maturation Process.
Adamson, Catherine S
2012-01-01
Protease-mediated maturation of HIV-1 virus particles is essential for virus infectivity. Maturation occurs concomitant with immature virus particle release and is mediated by the viral protease (PR), which sequentially cleaves the Gag and Gag-Pol polyproteins into mature protein domains. Maturation triggers a second assembly event that generates a condensed conical capsid core. The capsid core organizes the viral RNA genome and viral proteins to facilitate viral replication in the next round of infection. The fundamental role of proteolytic maturation in the generation of mature infectious particles has made it an attractive target for therapeutic intervention. Development of small molecules that target the PR active site has been highly successful, and nine protease inhibitors (PIs) have been approved for clinical use. This paper provides an overview of their development and clinical use together with a discussion of problems associated with drug resistance. The second half of the paper discusses a novel class of antiretroviral drug termed maturation inhibitors, which target cleavage sites in Gag rather than PR itself. The paper focuses on bevirimat (BVM), the first-in-class maturation inhibitor: its mechanism of action and the implications of naturally occurring polymorphisms that confer reduced susceptibility to BVM in phase II clinical trials.
Maturity Assessment of Space Plug-and-Play Architecture
2013-03-01
SSM: SPA Service Module; SRL: System Readiness Level; TAT: Time-at-Tone; TRA: Technology Readiness Assessment; TRL: Technology Readiness Level; USB: Universal... maturity assessment: the Technology Readiness Level (TRL) process, the Integration Readiness Level (IRL) process, and the System Readiness Level (SRL)... is an important hallmark of the SPA concept, and makes possible the composability and scalability of system designs that employ it.
Gan, Lin; Denecke, Bernd
2013-01-01
Mature microRNA is a crucial component in the gene expression regulation network. At the same time, microRNA gene expression and processing are regulated in a precise and coordinated way. Pre-microRNAs are intermediate products of the microRNA transcription process; they can provide hints about microRNA gene expression regulation or can serve as alternative biomarkers. To date, little effort has been devoted to pre-microRNA expression profiling. In this study, three human and three mouse microRNA profile data sets, based on the Affymetrix miRNA 2.0 array, have been re-analyzed for both mature and pre-microRNA signals as a primary test of parallel mature/pre-microRNA expression profiling on a single platform. The results not only provide a glimpse of pre-microRNA expression in human and mouse, but also of the relationship between the pre- and mature forms of microRNA expression. The study also showed a possible application of currently available microRNA microarrays in profiling pre-microRNA expression in a time- and cost-effective manner. PMID:27605179
Proteomic analysis of mature and immature ejaculated spermatozoa from fertile men
Cui, Zhihong; Sharma, Rakesh; Agarwal, Ashok
2016-01-01
Dysfunctional spermatozoa maturation is the main reason for the decrease in sperm motility and morphology in infertile men. Ejaculated spermatozoa from healthy fertile men were separated into four fractions using a three-layer density gradient. Proteins were extracted, and digested bands were analyzed on an LTQ-Orbitrap Elite hybrid mass spectrometer system. Functional annotations of proteins were obtained using bioinformatics tools and pathway databases. Western blotting was performed to verify the expression levels of the proteins of interest. A total of 1469 proteins were identified in the four fractions of spermatozoa. The number of detected proteins decreased according to the maturation level of the spermatozoa. During spermatozoa maturation, proteins involved in gamete generation, cell motility, energy metabolism and oxidative phosphorylation showed increasing expression levels, and those involved in protein biosynthesis, protein transport, protein ubiquitination, and response to oxidative stress showed decreasing expression levels. We validated four proteins (HSP 70 1A, clusterin, tektin 2 and tektin 3) by Western blotting. The study identifies protein markers that may provide insight into ejaculated spermatozoa proteins at different stages of sperm maturation, proteins that may be altered or modified in infertile men. PMID:26510506
Centriole maturation requires regulated Plk1 activity during two consecutive cell cycles.
Kong, Dong; Farmer, Veronica; Shukla, Anil; James, Jana; Gruskin, Richard; Kiriyama, Shigeo; Loncarek, Jadranka
2014-09-29
Newly formed centrioles in cycling cells undergo a maturation process that is almost two cell cycles long before they become competent to function as microtubule-organizing centers and basal bodies. As a result, each cell contains three generations of centrioles, only one of which is able to form cilia. It is not known how this long and complex process is regulated. We show that controlled Plk1 activity is required for gradual biochemical and structural maturation of the centrioles and timely appendage assembly. Inhibition of Plk1 impeded accumulation of appendage proteins and appendage formation. Unscheduled Plk1 activity, either in cycling or interphase-arrested cells, accelerated centriole maturation and appendage and cilia formation on the nascent centrioles, erasing the age difference between centrioles in one cell. These findings provide a new understanding of how the centriole cycle is regulated and how proper cilia and centrosome numbers are maintained in the cells.
SEL's Software Process-Improvement Program
NASA Technical Reports Server (NTRS)
Basili, Victor; Zelkowitz, Marvin; McGarry, Frank; Page, Jerry; Waligora, Sharon; Pajerski, Rose
1995-01-01
The goals and operations of the Software Engineering Laboratory (SEL) are reviewed. For nearly 20 years the SEL has worked to understand, assess, and improve software and the development process within the production environment of the Flight Dynamics Division (FDD) of NASA's Goddard Space Flight Center. The SEL was established in 1976 with the goals of reducing: (1) the defect rate of delivered software, (2) the cost of software to support flight projects, and (3) the average time to produce mission-support software. After studying over 125 FDD projects, the results have guided the standards, management practices, technologies, and training within the division. The results of the studies have been a 75 percent reduction in defects, a 50 percent reduction in cost, and a 25 percent reduction in development time. Over time the goals of the SEL have been clarified and are now stated as: (1) understand baseline processes and product characteristics, (2) assess improvements that have been incorporated into the development projects, and (3) package and infuse improvements into the standard SEL process. The SEL improvement goal is to demonstrate continual improvement of the software process by carrying out analysis, measurement, and feedback to projects within the FDD environment. The SEL supports understanding of the process by studying several process characteristics, including effort distribution and error detection rates. The SEL then assesses and refines the processes. Once the assessment and refinement of a process is completed, the SEL packages the process by capturing it in standards, tools, and training.
Gutierrez, Alejandro P; Yáñez, José M; Fukui, Steve; Swift, Bruce; Davidson, William S
2015-01-01
Early sexual maturation is considered a serious drawback for Atlantic salmon aquaculture as it retards growth, increases production times and affects flesh quality. Although both growth and sexual maturation are thought to be complex processes controlled by several genetic and environmental factors, selection for these traits has been continuously accomplished since the beginning of Atlantic salmon selective breeding programs. In this genome-wide association study (GWAS) we used a 6.5K single-nucleotide polymorphism (SNP) array to genotype ∼ 480 individuals from the Cermaq Canada broodstock program and search for SNPs associated with growth and age at sexual maturation. Using a mixed model approach we identified markers showing a significant association with growth, grilsing (early sexual maturation) and late sexual maturation. The most significant associations were found for grilsing, with markers located in Ssa10, Ssa02, Ssa13, Ssa25 and Ssa12, and for late maturation with markers located in Ssa28, Ssa01 and Ssa21. A lower level of association was detected with growth on Ssa13. Candidate genes, which were linked to these genetic markers, were identified and some of them show a direct relationship with developmental processes, especially for those in association with sexual maturation. However, the relatively low power to detect genetic markers associated with growth (days to 5 kg) in this GWAS indicates the need to use a higher density SNP array in order to overcome the low levels of linkage disequilibrium observed in Atlantic salmon before the information can be incorporated into a selective breeding program.
Modeling Student Software Testing Processes: Attitudes, Behaviors, Interventions, and Their Effects
ERIC Educational Resources Information Center
Buffardi, Kevin John
2014-01-01
Effective software testing identifies potential bugs and helps correct them, producing more reliable and maintainable software. As software development processes have evolved, incremental testing techniques have grown in popularity, particularly with introduction of test-driven development (TDD). However, many programmers struggle to adopt TDD's…
Space Flight Software Development Software for Intelligent System Health Management
NASA Technical Reports Server (NTRS)
Trevino, Luis C.; Crumbley, Tim
2004-01-01
The slide presentation examines the Marshall Space Flight Center Flight Software Branch, including software development projects, mission critical space flight software development, software technical insight, advanced software development technologies, and continuous improvement in the software development processes and methods.
ISAC's Gating-ML 2.0 data exchange standard for gating description.
Spidlen, Josef; Moore, Wayne; Brinkman, Ryan R
2015-07-01
The lack of software interoperability with respect to gating has traditionally been a bottleneck preventing the use of multiple analytical tools and reproducibility of flow cytometry data analysis by independent parties. To address this issue, ISAC developed Gating-ML, a computer file format to encode and interchange gates. Gating-ML 1.5 was adopted and published as an ISAC Candidate Recommendation in 2008. Feedback during the probationary period from implementors, including major commercial software companies, instrument vendors, and the wider community, has led to a streamlined Gating-ML 2.0. Gating-ML has been significantly simplified and therefore easier to support by software tools. To aid developers, free, open source reference implementations, compliance tests, and detailed examples are provided to stimulate further commercial adoption. ISAC has approved Gating-ML as a standard ready for deployment in the public domain and encourages its support within the community as it is at a mature stage of development having undergone extensive review and testing, under both theoretical and practical conditions. © 2015 International Society for Advancement of Cytometry.
Process Acceptance and Adoption by IT Software Project Practitioners
ERIC Educational Resources Information Center
Guardado, Deana R.
2012-01-01
This study addresses the question of what factors determine acceptance and adoption of processes in the context of Information Technology (IT) software development projects. This specific context was selected because processes required for managing software development projects are less prescriptive than in other, more straightforward, IT…
Semantic Service Matchmaking in the ATM Domain Considering Infrastructure Capability Constraints
NASA Astrophysics Data System (ADS)
Moser, Thomas; Mordinyi, Richard; Sunindyo, Wikan Danar; Biffl, Stefan
In a service-oriented environment business processes flexibly build on software services provided by systems in a network. A key design challenge is the semantic matchmaking of business processes (BPs) and software services in two steps: 1. Find for one business process the software services that meet or exceed the BP requirements; 2. Find for all business processes the software services that can be implemented within the capability constraints of the underlying network, which poses a major problem since even for small scenarios the solution space is typically very large. In this chapter we analyze requirements from mission-critical business processes in the Air Traffic Management (ATM) domain and introduce an approach for semi-automatic semantic matchmaking of software services, the "System-Wide Information Sharing" (SWIS) business process integration framework. A tool-supported semantic matchmaking process like SWIS can provide system designers and integrators with a set of promising software service candidates and therefore strongly reduces the human matching effort by focusing on a much smaller space of matchmaking candidates. We evaluate the feasibility of the SWIS approach in an industry use case from the ATM domain.
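The two-step matchmaking described above can be sketched in a few lines: first filter the services whose advertised capabilities cover a business process's requirements, then enumerate only those per-process assignments that fit within a shared network capability constraint. The service names, capability sets, and the single "bandwidth" constraint below are hypothetical stand-ins for the SWIS capability model, not its actual data structures.

```python
from itertools import product

# Hypothetical service registry: each service advertises the requirements it
# can satisfy and the network bandwidth it consumes (illustrative values).
SERVICES = {
    "svc_a": {"meets": {"track_data", "encryption"}, "bandwidth": 40},
    "svc_b": {"meets": {"track_data"}, "bandwidth": 10},
    "svc_c": {"meets": {"weather_feed", "encryption"}, "bandwidth": 25},
}

def candidates(required):
    """Step 1: services whose capabilities meet or exceed one BP's requirements."""
    return [name for name, svc in SERVICES.items() if required <= svc["meets"]]

def feasible_assignments(processes, capacity):
    """Step 2: assignments of one candidate service per business process
    that stay within the shared network capacity constraint."""
    per_bp = [candidates(req) for req in processes.values()]
    feasible = []
    for combo in product(*per_bp):
        if sum(SERVICES[s]["bandwidth"] for s in combo) <= capacity:
            feasible.append(dict(zip(processes, combo)))
    return feasible
```

With two business processes requiring `track_data` and `weather_feed` and a capacity of 40, only the low-bandwidth pairing survives step 2; this is the combinatorial pruning that makes the full solution space so large for real scenarios.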
Ramos, Inés; Cisint, Susana B; Crespo, Claudia A; Medina, Marcela F; Fernández, Silvia N
2009-08-01
The localization of calcium and Ca-ATPase activity in Bufo arenarum oocytes was investigated by ultracytochemical techniques during progesterone-induced nuclear maturation, under in vitro conditions. No Ca2+ deposits were detected in either control oocytes or progesterone-treated ones for 1-2 h. At the time when nuclear migration started, electron dense deposits of Ca2+ were visible in vesicles, endoplasmic reticulum cisternae and in the space between the annulate lamellae membranes. Furthermore, Ca-ATPase activity was also detected in these membrane structures. As maturation progressed, the cation deposits were observed in the cytomembrane structures, which underwent an important reorganization and redistribution. Thus, they moved from the subcortex and became located predominantly in the oocyte cortex area when nuclear maturation ended. Ca2+ stores were observed in vesicles surrounding or between the cortical granules, which are aligned close to the plasma membrane. The positive Ca-ATPase reaction in these membrane structures could indicate that the calcium deposit is an ATP-dependent process. Our results suggest that during oocyte maturation calcium would be stored in membrane structures where it remains available for release at the time of fertilization. Data obtained under our experimental conditions indicate that calcium from the extracellular medium would be important for the oocyte maturation process.
The integration of the risk management process with the lifecycle of medical device software.
Pecoraro, F; Luzi, D
2014-01-01
The application of software in the Medical Device (MD) domain has become central to the improvement of diagnoses and treatments. The new European regulations that specifically address software as an important component of MDs require complex procedures to make software compliant with safety requirements, thereby introducing new challenges in the qualification and classification of MD software as well as in the performance of risk management activities. Under this perspective, the aim of this paper is to propose an integrated framework that combines the activities to be carried out by the manufacturer to develop safe software within the development lifecycle, based on the regulatory requirements reported in US and European regulations as well as in the relevant standards and guidelines. A comparative analysis was carried out to identify the main issues related to the application of the current new regulations. In addition, standards and guidelines recently released to harmonise procedures for the validation of MD software have been used to define the risk management activities to be carried out by the manufacturer during the software development process. This paper highlights the main issues related to the qualification and classification of MD software, providing an analysis of the different regulations applied in Europe and the US. A model that integrates the risk management process within the software development lifecycle has also been proposed. It is based on regulatory requirements and considers software risk analysis as a central input to be managed by the manufacturer from the initial stages of software design, in order to prevent MD failures. Relevant changes in the process of MD development have been introduced with the recognition of software as an important component of MDs, as stated in regulations and standards.
This implies the performance of highly iterative processes that have to integrate the risk management in the framework of software development. It also makes it necessary to involve both medical and software engineering competences to safeguard patient and user safety.
Evolving software reengineering technology for the emerging innovative-competitive era
NASA Technical Reports Server (NTRS)
Hwang, Phillip Q.; Lock, Evan; Prywes, Noah
1994-01-01
This paper reports on a multi-tool commercial/military environment combining software Domain Analysis techniques with Reusable Software and Reengineering of Legacy Software. It is based on the development of a military version for the Department of Defense (DOD). The integrated tools in the military version are: Software Specification Assistant (SSA) and Software Reengineering Environment (SRE), developed by Computer Command and Control Company (CCCC) for Naval Surface Warfare Center (NSWC) and Joint Logistics Commanders (JLC), and the Advanced Research Project Agency (ARPA) STARS Software Engineering Environment (SEE) developed by Boeing for NAVAIR PMA 205. The paper describes transitioning these integrated tools to commercial use. There is a critical need for the transition for the following reasons: First, to date, 70 percent of programmers' time is applied to software maintenance. The work of these users has not been facilitated by existing tools. The addition of Software Reengineering will also facilitate software maintenance and upgrading. In fact, the integrated tools will support the entire software life cycle. Second, the integrated tools are essential to Business Process Reengineering, which seeks radical process innovations to achieve breakthrough results. Done well, process reengineering delivers extraordinary gains in process speed, productivity and profitability. Most importantly, it discovers new opportunities for products and services in collaboration with other organizations. Legacy computer software must be changed rapidly to support innovative business processes. The integrated tools will provide commercial organizations important competitive advantages. This, in turn, will increase employment by creating new business opportunities. Third, the integrated system will produce much higher quality software than use of the tools separately. 
The reason for this is that producing or upgrading software requires keen understanding of extremely complex applications which is facilitated by the integrated tools. The radical savings in the time and cost associated with software, due to use of CASE tools that support combined Reuse of Software and Reengineering of Legacy Code, will add an important impetus to improving the automation of enterprises. This will be reflected in continuing operations, as well as in innovating new business processes. The proposed multi-tool software development is based on state of the art technology, which will be further advanced through the use of open systems for adding new tools and experience in their use.
Measuring the impact of computer resource quality on the software development process and product
NASA Technical Reports Server (NTRS)
Mcgarry, Frank; Valett, Jon; Hall, Dana
1985-01-01
The availability and quality of computer resources during the software development process was speculated to have measurable, significant impact on the efficiency of the development process and the quality of the resulting product. Environment components such as the types of tools, machine responsiveness, and quantity of direct access storage may play a major role in the effort to produce the product and in its subsequent quality as measured by factors such as reliability and ease of maintenance. During the past six years, the NASA Goddard Space Flight Center has conducted experiments with software projects in an attempt to better understand the impact of software development methodologies, environments, and general technologies on the software process and product. Data was extracted and examined from nearly 50 software development projects. All were related to support of satellite flight dynamics ground-based computations. The relationship between computer resources and the software development process and product as exemplified by the subject NASA data was examined. Based upon the results, a number of computer resource-related implications are provided.
Software for Demonstration of Features of Chain Polymerization Processes
ERIC Educational Resources Information Center
Sosnowski, Stanislaw
2013-01-01
Free software for the demonstration of the features of homo- and copolymerization processes (free radical, controlled radical, and living) is described. The software is based on Monte Carlo algorithms and offers insight into the kinetics, molecular weight distribution, and microstructure of the macromolecules formed in those processes. It also…
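The Monte Carlo approach mentioned in the abstract can be illustrated with a minimal sketch (not Sosnowski's software): an idealized free-radical homopolymerization in which a growing chain either adds a monomer with probability p or terminates, yielding a geometric chain-length distribution. Rate-constant details, copolymerization, and controlled/living mechanisms are deliberately omitted.

```python
import random

def simulate_chains(n_chains, p_propagate, seed=0):
    """Grow n_chains one monomer at a time: with probability p_propagate the
    radical adds a monomer, otherwise it terminates (disproportionation only).
    Returns the list of final chain lengths (geometrically distributed)."""
    rng = random.Random(seed)
    lengths = []
    for _ in range(n_chains):
        n = 1  # the initiated chain already carries one monomer unit
        while rng.random() < p_propagate:
            n += 1
        lengths.append(n)
    return lengths

def averages(lengths):
    """Number-average and weight-average degrees of polymerization,
    plus the dispersity (Dw/Dn)."""
    dn = sum(lengths) / len(lengths)
    dw = sum(n * n for n in lengths) / sum(lengths)
    return dn, dw, dw / dn
```

For p = 0.99 the number-average degree of polymerization is about 1/(1 - p) = 100, and the dispersity tends to (1 + p), close to the textbook value of 2 for ideal free-radical polymerization with termination by disproportionation.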
The Personal Software Process: Downscaling the factory
NASA Technical Reports Server (NTRS)
Roy, Daniel M.
1994-01-01
It is argued that the next wave of software process improvement (SPI) activities will be based on a people-centered paradigm. The most promising such paradigm, Watts Humphrey's Personal Software Process (PSP), is summarized and its advantages are listed. The concepts of the PSP are also shown to fit a down-scaled version of Basili's experience factory. The author's data and lessons learned while practicing the PSP are presented, along with personal experience, observations, and advice from the perspective of a consultant and teacher of the Personal Software Process.
SEPAC software configuration control plan and procedures, revision 1
NASA Technical Reports Server (NTRS)
1981-01-01
The SEPAC Software Configuration Control Plan and Procedures are presented. The objective of software configuration control is to establish the process for maintaining configuration control of the SEPAC software, beginning with the baselining of SEPAC Flight Software Version 1 and encompassing the integration and verification tests through Spacelab Level IV Integration. The procedures are designed to provide a simplified but complete configuration control process. The intent is to require a minimum amount of paperwork but provide total traceability of the SEPAC software.
Organizational management practices for achieving software process improvement
NASA Technical Reports Server (NTRS)
Kandt, Ronald Kirk
2004-01-01
The crisis in developing software has been known for over thirty years. Problems that existed in developing software in the early days of computing still exist today. These problems include the delivery of low-quality products, actual development costs that exceed expected development costs, and actual development time that exceeds expected development time. Several solutions have been offered to overcome our inability to deliver high-quality software on time and within budget. One of these solutions involves software process improvement. However, such efforts often fail because of organizational management issues. This paper discusses business practices that organizations should follow to improve their chances of initiating and sustaining successful software process improvement efforts.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pager, Cara Theresia; Craft, Willie Warren; Patch, Jared
2006-03-15
The Nipah virus fusion (F) protein is proteolytically processed to F1 + F2 subunits. We demonstrate here that cathepsin L is involved in this important maturation event. Cathepsin inhibitors ablated cleavage of Nipah F. Proteolytic processing of Nipah F and fusion activity were dramatically reduced in cathepsin L shRNA-expressing Vero cells. Additionally, Nipah virus F-mediated fusion was inhibited in cathepsin L-deficient cells, but coexpression of cathepsin L restored fusion activity. Both purified cathepsin L and B could cleave immunopurified Nipah F protein, but only cathepsin L produced products of the correct size. Our results suggest that endosomal cathepsins can cleave Nipah F, but that cathepsin L specifically converts Nipah F to a mature and fusogenic form.
Impact of Growing Business on Software Processes
NASA Astrophysics Data System (ADS)
Nikitina, Natalja; Kajko-Mattsson, Mira
When growing their businesses, software organizations should put effort not only into developing and executing their business strategies, but also into managing and improving their internal software development processes and aligning them with business growth strategies. Only in this way can they ensure that their businesses grow in a healthy and sustainable way. In this paper, we map out one software company's business growth over the course of its historical events and identify its impact on the company's software production processes and capabilities. The impact concerns benefits, challenges, problems and lessons learned. The most important lesson learned is that although business growth stimulated the organization to reflect on and improve its software processes, the organization lacked guidelines for aligning those processes with business growth. Finally, the paper generates research questions providing a platform for future research.
A Recommended Framework for the Network-Centric Acquisition Process
2009-09-01
ISO/IEC 12207, Systems and Software Engineering - Software Life-Cycle Processes; ANSI/EIA 632, Processes for Engineering a System. There are...engineering [46]. Some of the process models presented in the DAG are: ISO/IEC 15288, Systems and Software Engineering - System Life-Cycle Processes...(e.g., ISO, IA, Security, etc.). Vetting developers helps ensure that they are using industry best practices and maximize IA compliance
Evaluation and selection of open-source EMR software packages based on integrated AHP and TOPSIS.
Zaidan, A A; Zaidan, B B; Al-Haiqi, Ahmed; Kiah, M L M; Hussain, Muzammil; Abdulnabi, Mohamed
2015-02-01
Evaluating and selecting software packages that meet the requirements of an organization are difficult aspects of the software engineering process. Selecting the wrong open-source EMR software package can be costly and may adversely affect business processes and functioning of the organization. This study aims to evaluate and select open-source EMR software packages based on multi-criteria decision-making. A hands-on study was performed and a set of open-source EMR software packages were implemented locally on separate virtual machines to examine the systems more closely. Several measures were specified as an evaluation basis, and the systems were ranked on a set of metric outcomes using the integrated Analytic Hierarchy Process (AHP) and TOPSIS. The experimental results showed that GNUmed and OpenEMR achieved better ranking scores than the other open-source EMR software packages. Copyright © 2014 Elsevier Inc. All rights reserved.
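The TOPSIS ranking step described above can be sketched in a few lines. The scores, weights, and criterion directions below are invented for illustration; the study's actual criteria and data are not reproduced here.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives by closeness to the ideal solution.

    matrix  : (alternatives x criteria) raw scores
    weights : criterion weights summing to 1 (in the integrated AHP/TOPSIS
              approach these would come from AHP pairwise comparisons)
    benefit : True where larger is better, False for cost criteria
    """
    m = np.asarray(matrix, dtype=float)
    v = (m / np.linalg.norm(m, axis=0)) * np.asarray(weights)  # weighted, normalized
    best = np.where(benefit, v.max(axis=0), v.min(axis=0))     # ideal solution
    worst = np.where(benefit, v.min(axis=0), v.max(axis=0))    # anti-ideal
    d_best = np.linalg.norm(v - best, axis=1)
    d_worst = np.linalg.norm(v - worst, axis=1)
    return d_worst / (d_best + d_worst)   # closeness: higher is better

# Hypothetical scores for three EMR packages on two benefit criteria and one
# cost criterion (invented numbers, not the study's data).
scores = [[8, 7, 3],
          [6, 9, 2],
          [5, 5, 5]]
closeness = topsis(scores, [0.5, 0.3, 0.2], [True, True, False])
ranking = np.argsort(closeness)[::-1]     # package indices, best first
```

In a full AHP/TOPSIS workflow the weight vector would be derived from AHP pairwise-comparison matrices and consistency-checked rather than supplied directly.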
Software Configuration Management Guidebook
NASA Technical Reports Server (NTRS)
1995-01-01
The growth in cost and importance of software to NASA has caused NASA to address the improvement of software development across the agency. One of the products of this program is a series of guidebooks that define a NASA concept of the assurance processes which are used in software development. The Software Assurance Guidebook, SMAP-GB-A201, issued in September, 1989, provides an overall picture of the concepts and practices of NASA in software assurance. Lower level guidebooks focus on specific activities that fall within the software assurance discipline, and provide more detailed information for the manager and/or practitioner. This is the Software Configuration Management Guidebook which describes software configuration management in a way that is compatible with practices in industry and at NASA Centers. Software configuration management is a key software development process, and is essential for doing software assurance.
GeneFisher-P: variations of GeneFisher as processes in Bio-jETI
Lamprecht, Anna-Lena; Margaria, Tiziana; Steffen, Bernhard; Sczyrba, Alexander; Hartmeier, Sven; Giegerich, Robert
2008-01-01
Background PCR primer design is an everyday, but not trivial task requiring state-of-the-art software. We describe the popular tool GeneFisher and explain its recent restructuring using workflow techniques. We apply a service-oriented approach to model and implement GeneFisher-P, a process-based version of the GeneFisher web application, as a part of the Bio-jETI platform for service modeling and execution. We show how to introduce a flexible process layer to meet the growing demand for improved user-friendliness and flexibility. Results Within Bio-jETI, we model the process using the jABC framework, a mature model-driven, service-oriented process definition platform. We encapsulate remote legacy tools and integrate web services using jETI, an extension of the jABC for seamless integration of remote resources as basic services, ready to be used in the process. Some of the basic services used by GeneFisher are in fact already provided as individual web services at BiBiServ and can be directly accessed. Others are legacy programs, and are made available to Bio-jETI via the jETI technology. The full power of service-based process orientation is required when more bioinformatics tools, available as web services or via jETI, lead to easy extensions or variations of the basic process. This concerns for instance variations of data retrieval or alignment tools as provided by the European Bioinformatics Institute (EBI). Conclusions The resulting service- and process-oriented GeneFisher-P demonstrates how basic services from heterogeneous sources can be easily orchestrated in the Bio-jETI platform and lead to a flexible family of specialized processes tailored to specific tasks. PMID:18460174
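The process-layer idea above, composing independent services into a workflow whose steps can be swapped or extended, can be illustrated with a toy pipeline. The step functions, data, and primer rule are invented stand-ins, not GeneFisher-P's actual BiBiServ/jETI services.

```python
# Toy illustration of a service-oriented process layer: plain functions stand
# in for remote services and are composed into a primer-design workflow. All
# names and the "primer" rule are hypothetical.

def fetch_sequence(acc):
    return {"id": acc, "seq": "ATGGCGTACGTT"}   # canned sequence for the demo

def align(seqs):
    return {"consensus": seqs[0]["seq"]}        # trivial one-sequence "alignment"

def design_primers(aln):
    seq = aln["consensus"]
    return [seq[:6], seq[-6:]]                  # naive forward/reverse primers

def run_workflow(start, steps):
    data = start
    for step in steps:                          # each step consumes the last result
        data = step(data)
    return data

primers = run_workflow("X123", [fetch_sequence, lambda s: align([s]), design_primers])
```

The point of the pattern is that variations (a different alignment tool, an extra data-retrieval step) are new step lists, not new programs.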
Experimentation in software engineering
NASA Technical Reports Server (NTRS)
Basili, V. R.; Selby, R. W.; Hutchens, D. H.
1986-01-01
Experimentation in software engineering supports the advancement of the field through an iterative learning process. In this paper, a framework for analyzing most of the experimental work performed in software engineering over the past several years is presented. A variety of experiments in the framework are described and their contributions to the software engineering discipline are discussed. Some useful recommendations for the application of the experimental process in software engineering are included.
2010-06-01
cannot make a distinction between software maintenance and development” (Sharma, 2004). ISO/IEC 12207 Software Lifecycle Processes offers a guide to...synopsis of ISO/IEC 12207, Raghu Singh of the Federal Aviation Administration states “Whenever a software product needs modifications, the development...Corporation. Singh, R. (1998). International Standard ISO/IEC 12207 Software Life Cycle Processes. Washington: Federal Aviation Administration. The Joint
A Bibliography of the Personal Software Process (PSP) and the Team Software Process (TSP)
2009-10-01
Postmortem.” Proceedings of the TSP Symposium (September 2007). http://www.sei.cmu.edu/tspsymposium/ Rickets, Chris; Lindeman, Robert; & Hodgins, Brad... Rickets, Chris A. “A TSP Software Maintenance Life Cycle.” CrossTalk (March 2005). Rozanc, I. & Mahnic, V. “Teaching Software Quality with Emphasis on PSP
A Process for Evaluating Student Records Management Software. ERIC/AE Digest.
ERIC Educational Resources Information Center
Vecchioli, Lisa
This digest provides practical advice on evaluating software for managing student records. An evaluation of record-keeping software should start with a process to identify all of the individual needs the software product must meet in order to be considered for purchase. The first step toward establishing an administrative computing system is…
Cutting edge: Contact with secondary lymphoid organs drives postthymic T cell maturation.
Houston, Evan G; Nechanitzky, Robert; Fink, Pamela J
2008-10-15
T cell development, originally thought to be completed in the thymus, has recently been shown to continue for several weeks in the lymphoid periphery. The forces that drive this peripheral maturation are unclear. The use of mice transgenic for GFP driven by the RAG2 promoter has enabled the ready identification and analysis of recent thymic emigrants. Here, we show that recent thymic emigrant maturation is a progressive process and is promoted by T cell exit from the thymus. Further, we show that this maturation occurs within secondary lymphoid organs and does not require extensive lymphocyte recirculation.
Automated Translation of Safety Critical Application Software Specifications into PLC Ladder Logic
NASA Technical Reports Server (NTRS)
Leucht, Kurt W.; Semmel, Glenn S.
2008-01-01
The numerous benefits of automatic application code generation are widely accepted within the software engineering community. A few of these benefits include raising the abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at the NASA Kennedy Space Center (KSC) recognized the need for PLC code generation while developing their new ground checkout and launch processing system. They developed a process and a prototype software tool that automatically translates a high-level representation or specification of safety critical application software into ladder logic that executes on a PLC. This process and tool are expected to increase the reliability of the PLC code over that which is written manually, and may even lower life-cycle costs and shorten the development schedule of the new control system at KSC. This paper examines the problem domain and discusses the process and software tool that were prototyped by the KSC software engineers.
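A rule-based translation of the kind described can be sketched as follows: a hypothetical boolean interlock specification is compiled to IEC 61131-3 Instruction List text, a textual stand-in for ladder rungs. The spec format and tag names are invented for illustration and do not reflect the KSC tool's actual notation.

```python
# Sketch of spec-to-PLC code generation. Each spec entry maps an output coil
# to an 'A AND NOT B' style condition over input contacts; the translator
# emits LD/AND/ANDN/ST instructions (IEC 61131-3 Instruction List).

def translate_rung(output, condition):
    """Compile a boolean condition string into Instruction List lines."""
    instrs = []
    op = "LD"          # the first contact loads the rung
    negate = False
    for tok in condition.split():
        if tok == "AND":
            op = "AND"
        elif tok == "NOT":
            negate = True
        else:
            instrs.append(f"{op}N {tok}" if negate else f"{op} {tok}")
            negate = False
    instrs.append(f"ST {output}")   # energize the output coil
    return instrs

# Hypothetical safety interlock: open the valve only if pressure is OK and
# the emergency stop is not engaged.
spec = [("ValveOpen", "PressureOK AND NOT EStop")]
code = [line for out, cond in spec for line in translate_rung(out, cond)]
```

A production generator would of course parse a richer specification language and emit vendor-specific ladder files, but the mechanical spec-to-rung mapping is the core idea.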
The Package-Based Development Process in the Flight Dynamics Division
NASA Technical Reports Server (NTRS)
Parra, Amalia; Seaman, Carolyn; Basili, Victor; Kraft, Stephen; Condon, Steven; Burke, Steven; Yakimovich, Daniil
1997-01-01
The Software Engineering Laboratory (SEL) has been operating for more than two decades in the Flight Dynamics Division (FDD) and has adapted to the constant movement of the software development environment. The SEL's Quality Improvement Paradigm (QIP) shows that process improvement is an iterative process: Understanding, Assessing, and Packaging are the three steps followed in this cyclical paradigm. As the improvement process cycles back to the first step, after having packaged some experience, the level of understanding will be greater. In the past, products resulting from the packaging step have been large process documents, guidebooks, and training programs. As the technical world moves toward more modularized software, we have moved toward more modularized software development process documentation; as such, the products of the packaging step are becoming smaller and more frequent. In this manner, the QIP takes on a more spiral approach rather than a waterfall. This paper describes the state of the FDD in the area of software development processes, as revealed through the understanding and assessing activities conducted by the COTS study team. The insights presented include: (1) a characterization of a typical FDD Commercial Off the Shelf (COTS) intensive software development life-cycle process, (2) lessons learned through the COTS study interviews, and (3) a description of changes in the SEL due to the changing and accelerating nature of software development in the FDD.
Byun, Bo-Ram; Kim, Yong-Il; Maki, Koutaro; Son, Woo-Sung
2015-01-01
This study aimed to examine the correlation between skeletal maturation status and parameters from the odontoid process/body of the second cervical vertebra and the bodies of the third and fourth cervical vertebrae, and to build multiple regression models able to estimate skeletal maturation status in Korean girls. Hand-wrist radiographs and cone beam computed tomography (CBCT) images were obtained from 74 Korean girls (6–18 years of age). CBCT-generated cervical vertebral maturation (CVM) was used to demarcate the odontoid process and the body of the second cervical vertebra, based on the dentocentral synchondrosis. Correlation coefficient analysis and multiple linear regression analysis were used for each parameter of the cervical vertebrae (P < 0.05). Forty-seven of 64 parameters from CBCT-generated CVM (independent variables) exhibited statistically significant correlations (P < 0.05). The multiple regression model with the greatest R² had six parameters (PH2/W2, UW2/W2, (OH+AH2)/LW2, UW3/LW3, D3, and H4/W4) as independent variables, with a variance inflation factor (VIF) of <2. CBCT-generated CVM was able to include parameters from the second cervical vertebral body and odontoid process, respectively, for the multiple regression models. This suggests that quantitative analysis might be used to estimate skeletal maturation status. PMID:25878721
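The VIF screen used above to keep multicollinearity out of the six-parameter model can be sketched as follows, assuming the standard definition VIF_j = 1 / (1 - R²_j), where R²_j comes from regressing predictor j on the remaining predictors. The data below are synthetic, not the study's cervical-vertebra measurements.

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of predictor matrix X."""
    X = np.asarray(X, dtype=float)
    out = []
    for j in range(X.shape[1]):
        y = X[:, j]
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(len(y)), others])   # intercept + predictors
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)     # OLS fit
        resid = y - A @ coef
        r2 = 1.0 - resid.var() / y.var()                 # R^2 of column j on the rest
        out.append(1.0 / (1.0 - r2))
    return out

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))   # three near-independent synthetic predictors
vifs = vif(X)                  # all close to 1; the study required VIF < 2
```

Highly correlated predictors would drive R²_j toward 1 and the corresponding VIF well above the study's threshold of 2, flagging the parameter for removal.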
A Stiffness Switch in Human Immunodeficiency Virus
Kol, Nitzan; Shi, Yu; Tsvitov, Marianna; Barlam, David; Shneck, Roni Z.; Kay, Michael S.; Rousso, Itay
2007-01-01
After budding from the cell, human immunodeficiency virus (HIV) and other retrovirus particles undergo a maturation process that is required for their infectivity. During maturation, HIV particles undergo a significant internal morphological reorganization, changing from a roughly spherically symmetric immature particle with a thick protein shell to a mature particle with a thin protein shell and conical core. However, the physical principles underlying viral particle production, maturation, and entry into cells remain poorly understood. Here, using nanoindentation experiments conducted by an atomic force microscope (AFM), we report the mechanical measurements of HIV particles. We find that immature particles are more than 14-fold stiffer than mature particles and that this large difference is primarily mediated by the HIV envelope cytoplasmic tail domain. Finite element simulation shows that for immature virions the average Young's modulus drops more than eightfold when the cytoplasmic tail domain is deleted (930 vs. 115 MPa). We also find a striking correlation between the softening of viruses during maturation and their ability to enter cells, providing the first evidence, to our knowledge, for a prominent role for virus mechanical properties in the infection process. These results show that HIV regulates its mechanical properties at different stages of its life cycle (i.e., stiff during viral budding versus soft during entry) and that this regulation may be important for efficient infectivity. Our report of this maturation-induced “stiffness switch” in HIV establishes the groundwork for mechanistic studies of how retroviral particles can regulate their mechanical properties to affect biological function. PMID:17158573
Bicarbonate Transport During Enamel Maturation.
Yin, Kaifeng; Paine, Michael L
2017-11-01
Amelogenesis (tooth enamel formation) is a biomineralization process consisting primarily of two stages (secretory stage and maturation stage) with unique features. During the secretory stage, the inner epithelium of the enamel organ (i.e., the ameloblast cells) synthesizes and secretes enamel matrix proteins (EMPs) into the enamel space. The protein-rich enamel matrix forms a highly organized architecture in a pH-neutral microenvironment. As amelogenesis transitions to maturation stage, EMPs are degraded and internalized by ameloblasts through endosomal-lysosomal pathways. Enamel crystallite formation is initiated early in the secretory stage, however, during maturation stage the more rapid deposition of calcium and phosphate into the enamel space results in a rapid expansion of crystallite length and mineral volume. During maturation-stage amelogenesis, the pH value of enamel varies considerably from slightly above neutral to acidic. Extracellular acid-base balance during enamel maturation is tightly controlled by ameloblast-mediated regulatory networks, which include significant synthesis and movement of bicarbonate ions from both the enamel papillary layer cells and ameloblasts. In this review we summarize the carbonic anhydrases and the carbonate transporters/exchangers involved in pH regulation in maturation-stage amelogenesis. Proteins that have been shown to be instrumental in this process include CA2, CA6, CFTR, AE2, NBCe1, SLC26A1/SAT1, SLC26A3/DRA, SLC26A4/PDS, SLC26A6/PAT1, and SLC26A7/SUT2. In addition, we discuss the association of miRNA regulation with bicarbonate transport in tooth enamel formation.
Reconstructing life history of hominids and humans.
Crews, Douglas E; Gerber, Linda M
2003-06-01
Aspects of life history, such as processes and timing of development, age at maturation, and life span are consistently associated with one another across the animal kingdom. Species that develop rapidly tend to mature and reproduce early, have many offspring, and exhibit shorter life spans (r-selection) than those that develop slowly, have extended periods of premature growth, mature later in life, reproduce later and less frequently, have few offspring and/or single births, and exhibit extended life spans (K-selection). In general, primates are among the most K-selected of species. A suite of highly derived life history traits characterizes humans. Among these are physically immature neonates, slowed somatic development both in utero and post-natally, late attainment of reproductive maturity and first birth, and extended post-mature survival. Exactly when, why, and through what types of evolutionary interactions this suite arose is currently the subject of much conjecture and debate. Humankind's biocultural adaptations have helped to structure human life history evolution in unique ways not seen in other animal species. Among all species, life history traits may respond rapidly to alterations in selective pressures through hormonal processes. Selective pressures on life history likely varied widely among hominids and humans over their evolutionary history. This suggests that current patterns of human growth, development, maturation, reproduction, and post-mature survival may be of recent genesis, rather than long-standing adaptations. Thus, life history patterns observed among contemporary human and chimpanzee populations may provide little insight into those that existed earlier in hominid/human evolution.
R.K. Dumroese; D.F. Jacobs; T.D. Landis
2005-01-01
Forest regeneration is a cyclic operation. Seeds are collected from mature trees and planted in nurseries so that the resulting seedlings can be outplanted to the forest after the mature trees are harvested. Similarly, the process of deciding upon, and growing, the best seedlings for that site should be a cyclic process between foresters and nursery managers. The ideal...
Cui, Jingqiu; Chen, Wei; Sun, Jinhong; Guo, Huan; Madley, Rachel; Xiong, Yi; Pan, Xingyi; Wang, Hongliang; Tai, Andrew W.; Weiss, Michael A.; Arvan, Peter; Liu, Ming
2015-01-01
Upon translocation across the endoplasmic reticulum (ER) membrane, secretory proteins are proteolytically processed to remove their signal peptide by signal peptidase (SPase). This process is critical for subsequent folding, intracellular trafficking, and maturation of secretory proteins. Prokaryotic SPase has been shown to be a promising antibiotic target. In contrast, to date, no eukaryotic SPase inhibitors have been reported. Here we report that introducing a proline immediately following the natural signal peptide cleavage site not only blocks preprotein cleavage but also, in trans, impairs the processing and maturation of co-expressed preproteins in the ER. Specifically, we find that a variant preproinsulin, pPI-F25P, is translocated across the ER membrane, where it binds to the catalytic SPase subunit SEC11A, inhibiting SPase activity in a dose-dependent manner. Similar findings were obtained with an analogous variant of preproparathyroid hormone, demonstrating that inhibition of the SPase does not depend strictly on the sequence or structure of the downstream mature protein. We further show that inhibiting SPase in the ER impairs intracellular processing of viral polypeptides and their subsequent maturation. These observations suggest that eukaryotic SPases (including the human ortholog) are, in principle, suitable therapeutic targets for antiviral drug design. PMID:26446786
Joseph, Jane E.; Gathers, Ann D.; Bhatt, Ramesh S.
2010-01-01
Face processing undergoes a fairly protracted developmental time course but the neural underpinnings are not well understood. Prior fMRI studies have only examined progressive changes (i.e., increases in specialization in certain regions with age), which would be predicted by both the Interactive Specialization (IS) and maturational theories of neural development. To differentiate between these accounts, the present study also examined regressive changes (i.e., decreases in specialization in certain regions with age), which is predicted by the IS but not maturational account. The fMRI results show that both progressive and regressive changes occur, consistent with IS. Progressive changes mostly occurred in occipital-fusiform and inferior frontal cortex whereas regressive changes largely emerged in parietal and lateral temporal cortices. Moreover, inconsistent with the maturational account, all of the regions involved in face viewing in adults were active in children, with some regions already specialized for face processing by 5 years of age and other regions activated in children but not specifically for faces. Thus, neurodevelopment of face processing involves dynamic interactions among brain regions including age-related increases and decreases in specialization and the involvement of different regions at different ages. These results are more consistent with IS than maturational models of neural development. PMID:21399706
The FoReVer Methodology: A MBSE Framework for Formal Verification
NASA Astrophysics Data System (ADS)
Baracchi, Laura; Mazzini, Silvia; Cimatti, Alessandro; Tonetta, Stefano; Garcia, Gerald
2013-08-01
The need for a high level of confidence and operational integrity in critical space (software) systems is well recognized in the Space industry and has been addressed so far through rigorous System and Software Development Processes and stringent Verification and Validation regimes. The Model Based Space System Engineering process (MBSSE) derived in the System and Software Functional Requirement Techniques study (SSFRT) focused on the application of model based engineering technologies to support the space system and software development processes, from mission level requirements to software implementation through model refinements and translations. In this paper we report on our work in the ESA-funded FoReVer project, where we aim at developing methodological, theoretical and technological support for a systematic approach to space avionics system development, in phases 0/A/B/C. FoReVer enriches the MBSSE process with contract-based formal verification of properties, at different stages from system to software, through a step-wise refinement approach, with support for a Software Reference Architecture.
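The contract-based refinement check at the heart of this approach can be illustrated in a deliberately toy form: contracts as (assumption, guarantee) predicate pairs over a small finite state space, with refinement checked by brute-force enumeration. Real FoReVer tooling relies on model checking, and the predicates below are invented.

```python
from itertools import product

# One common formulation of assume/guarantee refinement: the concrete
# contract refines the abstract one iff it assumes no more (abstract
# assumption implies concrete assumption) and guarantees no less (concrete
# guarantee implies abstract guarantee), here checked over all states.

def refines(abstract, concrete, states):
    a1, g1 = abstract
    a2, g2 = concrete
    return all((not a1(s) or a2(s)) and    # assumption may only be weakened
               (not g2(s) or g1(s))        # guarantee may only be strengthened
               for s in states)

states = list(product(range(4), repeat=2))                  # (input, output) pairs
system = (lambda s: s[0] < 4, lambda s: s[1] <= s[0] + 1)   # loose system guarantee
component = (lambda s: True, lambda s: s[1] == s[0])        # tighter component guarantee
ok = refines(system, component, states)                     # component refines system
```

Step-wise refinement repeats this check at each decomposition level, so that verifying the leaf components discharges the system-level properties.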
Three-dimensional analysis of third molar development to estimate age of majority.
Márquez-Ruiz, Ana Belén; Treviño-Tijerina, María Concepción; González-Herrera, Lucas; Sánchez, Belén; González-Ramírez, Amanda Rocío; Valenzuela, Aurora
2017-09-01
Third molars are one of the few biological markers available for age estimation in undocumented juveniles close to the legal age of majority, assuming an age of 18 years as the most frequent legal demarcation between child and adult status. To obtain more accurate visualization and evaluation of third molar mineralization patterns from computed tomography images, a new software application, DentaVol©, was developed. Third molar mineralization according to qualitative (Demirjian's maturational stage) and quantitative parameters (third molar volume) of dental development was assessed in multi-slice helical computed tomography images of both maxillary arches displayed by DentaVol© from 135 individuals (62 females and 73 males) aged between 14 and 23 years. Intra- and inter-observer agreement values were remarkably high for both evaluation procedures and for all third molars. A linear correlation between third molar mineralization and chronological age was found, with third molar maturity occurring earlier in males than in females. Assessment of dental development with both procedures, by using DentaVol© software, can be considered a good indicator of age of majority (18 years or older) in all third molars. Our results indicated that virtual computed tomography imaging can be considered a valid alternative to orthopantomography for evaluations of third molar mineralization, and therefore a complementary tool for determining the age of majority. Copyright © 2017 The Chartered Society of Forensic Sciences. Published by Elsevier B.V. All rights reserved.
Laskowska-Macios, Karolina; Nys, Julie; Hu, Tjing-Tjing; Zapasnik, Monika; Van der Perren, Anke; Kossut, Malgorzata; Burnat, Kalina; Arckens, Lutgarde
2015-08-14
Binocular pattern deprivation from eye opening (early BD) delays the maturation of the primary visual cortex. This delay is more pronounced for the peripheral than the central visual field representation within area 17, particularly between the age of 2 and 4 months [Laskowska-Macios, Cereb Cortex, 2014]. In this study, we probed for related dynamic changes in the cortical proteome. We introduced age, cortical region and BD as principal variables in a 2-D DIGE screen of area 17. In this way we explored the potential of BD-related protein expression changes between central and peripheral area 17 of 2- and 4-month-old BD (2BD, 4BD) kittens as a valid parameter towards the identification of brain maturation-related molecular processes. Consistent with the maturation delay, distinct developmental protein expression changes observed for normal kittens were postponed by BD, especially in the peripheral region. These BD-induced proteomic changes suggest a negative regulation of neurite outgrowth, synaptic transmission and clathrin-mediated endocytosis, thereby implicating these processes in normal experience-induced visual cortex maturation. Verification of the expression of proteins from each of the biological processes via Western analysis disclosed that some of the transient proteomic changes correlate to the distinct behavioral outcome in adult life, depending on timing and duration of the BD period [Neuroscience 2013;255:99-109]. Taken together, the plasticity potential to recover from BD, in relation to ensuing restoration of normal visual input, appears to rely on specific protein expression changes and cellular processes induced by the loss of pattern vision in early life.
Development and Test of Robotically Assisted Extravehicular Activity Gloves
NASA Technical Reports Server (NTRS)
Rogers, Jonathan M.; Peters, Benjamin J.; Laske, Evan A.; McBryan, Emily R.
2017-01-01
Over the past two years, the High Performance EVA Glove (HPEG) project under NASA's Space Technology Mission Directorate (STMD) funded an effort to develop an electromechanically-assisted space suit glove. The project was a collaboration between the Johnson Space Center's Software, Robotics, and Simulation Division and the Crew and Thermal Systems Division. The project sought to combine finger actuator technology developed for Robonaut 2 with the softgoods from the ILC Phase VI EVA glove. The Space Suit RoboGlove (SSRG) uses a system of three linear actuators to pull synthetic tendons attached to the glove's fingers to augment flexion of the user's fingers. To detect the user's inputs, the system utilizes a combination of string potentiometers along the back of the fingers and force sensitive resistors integrated into the fingertips of the glove cover layer. This paper discusses the development process from initial concepts through two major phases of prototypes, and the results of initial human testing. Initial work on the project focused on creating a functioning proof of concept, designing the softgoods integration, and demonstrating augmented grip strength with the actuators. The second year of the project focused on upgrading the actuators, sensors, and software with the overall goal of creating a system that moves with the user's fingers in order to reduce fatigue associated with the operation of a pressurized glove system. This paper also discusses considerations for a flight system based on this prototype development and addresses where further work is required to mature the technology.
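The sensor-driven assist loop described above might look roughly like the following sketch: a string potentiometer estimates the user's intended flexion, a fingertip force-sensitive resistor (FSR) confirms grip contact, and the tendon actuator follows the flexion when contact is present. The gain, threshold, and control law are entirely hypothetical; the SSRG's actual controller is not specified in this summary.

```python
# Hypothetical grip-assist control law (illustrative only, not the SSRG's).
# pot_flexion : normalized finger flexion from a string potentiometer, 0..1
# fsr_force   : fingertip FSR reading, arbitrary force units

def assist_command(pot_flexion, fsr_force, gain=0.8, contact_threshold=2.0):
    """Return a tendon actuator position command in [0, 1]."""
    if fsr_force < contact_threshold:
        return 0.0                       # no grip contact: stay passive
    return min(1.0, gain * pot_flexion)  # follow the user's flexion, scaled

# With firm fingertip contact at half flexion, the tendon follows the user.
cmd = assist_command(pot_flexion=0.5, fsr_force=3.5)
```

Gating the assist on fingertip contact is one plausible way to let the glove "move with the user's fingers" without fighting free motion, which is the fatigue-reduction goal the abstract describes.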
Recce NG: from Recce sensor to image intelligence (IMINT)
NASA Astrophysics Data System (ADS)
Larroque, Serge
2001-12-01
Recce NG (Reconnaissance New Generation) is presented as a complete and optimized Tactical Reconnaissance System. Based on a new generation Pod integrating high resolution Dual Band sensors, the system has been designed with the operational lessons learnt from the last Peace Keeping Operations in Bosnia and Kosovo. The technical solutions retained as component modules of a full IMINT acquisition system benefit from the state of the art in the following key technologies: Advanced Mission Planning System for long range stand-off Manned Recce, Aircraft and/or Pod tasking, operating sophisticated back-up software tools, high resolution 3D geo data and improved/combat proven MMI to reduce planning delays; Mature Dual Band sensor technology to achieve the Day and Night Recce Mission, including advanced automatic operational functions such as azimuth and roll tracking capabilities; low risk in Pod integration and in carrier avionics, controls and displays upgrades, to save time in operational turn over and maintenance; High rate Imagery Down Link, for Real Time or Near Real Time transmission, fully compatible with STANAG 7085 requirements; Advanced IMINT Exploitation Ground Segment, combat proven, NATO interoperable (STANAG 7023), integrating high value software tools for accurate location, improved radiometric image processing and an open link to C4ISR systems. The choice of an industrial Prime contractor mastering, across the full system, all of the previously listed key products and technologies is mandatory for successful delivery in terms of low Cost, Risk and Time Schedule.
SWiFT Software Quality Assurance Plan.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berg, Jonathan Charles
This document describes the software development practice areas and processes which contribute to the ability of SWiFT software developers to provide quality software. These processes are designed to satisfy the requirements set forth by the Sandia Software Quality Assurance Program (SSQAP). The SWiFT Software Quality Assurance Plan (SAND2016-0765) was approved by the Department Manager, Dave Minster (6121); the SWiFT Site Lead, Jonathan White (6121); and the SWiFT Controls Engineer, Jonathan Berg (6121). Change history: Issue A, 2016/01/27, originated by Jon Berg (06121), initial release of the SWiFT Software Quality Assurance Plan.
Loudon, David; Macdonald, Alastair S.; Carse, Bruce; Thikey, Heather; Jones, Lucy; Rowe, Philip J.; Uzor, Stephen; Ayoade, Mobolaji; Baillie, Lynne
2012-01-01
This paper describes the ongoing development and evaluation of prototype visualisation software designed to assist in understanding and improving appropriate movements during rehabilitation. The paper details the process of engaging users throughout the research project, including how the design of the visualisation software is being adapted to meet an emerging understanding of the needs of patients and professionals, and of the rehabilitation process. The value of this process for the design of the visualisation software is illustrated with a discussion of the findings of pre-pilot focus groups with stroke survivors and therapists. PMID:23011812
Code of Federal Regulations, 2010 CFR
2010-01-01
... Citrus Industry, Part 1, Chapter 20-13 Market Classification, Maturity Standards and Processing or... 2065-S, 14th and Independence Ave., Washington, DC 20250 or at the National Archives and Records...: http://www.archives.gov/federal_register/code_of_federal_regulations/ibr_locations.html. ...
NASA Technical Reports Server (NTRS)
Lee, Alice T.; Gunn, Todd; Pham, Tuan; Ricaldi, Ron
1994-01-01
This handbook documents the three software analysis processes the Space Station Software Analysis team uses to assess space station software, including their backgrounds, theories, tools, and analysis procedures. Potential applications of these analysis results are also presented. The first section describes how software complexity analysis provides quantitative information on code, such as code structure and risk areas, throughout the software life cycle. Software complexity analysis allows an analyst to understand the software structure, identify critical software components, assess risk areas within a software system, identify testing deficiencies, and recommend program improvements. Performing this type of analysis during the early design phases of software development can positively affect the development process and may prevent much larger difficulties later. The second section describes how software reliability estimation and prediction analysis, or software reliability, provides a quantitative means to measure the probability of failure-free operation of a computer program, and describes the two tools used by JSC to determine failure rates and design tradeoffs between reliability, costs, performance, and schedule.
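The reliability estimation described above can be illustrated with a minimal sketch. Assuming the simplest constant-failure-rate (exponential) model — the handbook's actual JSC tools and models are not specified here — the probability of failure-free operation over a mission of duration t is R(t) = exp(-λt), with λ estimated from observed failures per unit of test exposure. The function and variable names below are illustrative, not taken from the handbook:

```python
import math

def estimated_failure_rate(failures: int, exposure_hours: float) -> float:
    """Point estimate of a constant failure rate (failures per hour)
    from observed failures over total test exposure time."""
    return failures / exposure_hours

def reliability(failure_rate: float, mission_hours: float) -> float:
    """Probability of failure-free operation over mission_hours,
    assuming an exponential failure model: R(t) = exp(-lambda * t)."""
    return math.exp(-failure_rate * mission_hours)

# Hypothetical example: 3 failures observed over 1,500 hours of testing.
lam = estimated_failure_rate(3, 1500.0)        # 0.002 failures/hour
r = reliability(lam, 100.0)                    # 100-hour mission
print(round(r, 4))                             # exp(-0.2) ~ 0.8187
```

Such a model makes the tradeoffs mentioned in the abstract concrete: lowering the failure rate (through testing or redesign) or shortening the mission both raise R(t), which can be weighed against cost and schedule.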
The Strategic WAste Minimization Initiative (SWAMI) Software, Version 2.0 is a tool for using process analysis for identifying waste minimization opportunities within an industrial setting. The software requires user-supplied information for process definition, as well as materia...