
Free Software

In the 1950s, 1960s, and until the early 1970s, it was normal for computer users to have the software freedoms associated with free software. Software was commonly shared by individuals who used computers and by hardware manufacturers who welcomed the fact that people were making software that made their hardware useful.

By the early 1970s, the picture changed: software costs were dramatically increasing, a growing software industry was competing with the hardware manufacturer’s bundled software products, leased machines required software support while providing no revenue for software, and some customers able to better meet their own needs did not want the costs of free software bundled with hardware product costs.

In 1983, Richard Stallman, a longtime member of the hacker community at the MIT Artificial Intelligence Laboratory, announced the GNU project, saying that he had become frustrated with the effects of the change in culture of the computer industry and its users. Software development for the GNU operating system began in January 1984, and the Free Software Foundation (FSF) was founded in October 1985. Stallman developed a free software definition and the concept of copyleft, designed to ensure software freedom for all.

The first formal definition of free software was published by FSF in February 1986, and states that software is free software if people who receive a copy of the software have the following four freedoms –

  • Freedom 0: The freedom to run the program for any purpose.
  • Freedom 1: The freedom to study how the program works, and change it to make it do what you wish.
  • Freedom 2: The freedom to redistribute copies so you can help your neighbor.
  • Freedom 3: The freedom to improve the program, and release your improvements (and modified versions in general) to the public, so that the whole community benefits.

Freedoms 1 and 3 require source code to be available because studying and modifying software without its source code is highly impractical.

Licensing

A free software license is a software license that grants recipients extensive rights to modify and redistribute the software, which would otherwise be prohibited by copyright law. To qualify as a free software license, the license must grant the rights described in the Free Software Definition or one of the similar definitions based on it. The majority of free software falls under a small set of licenses.

There are different categories of free software :

  • Public domain software: the copyright has expired, the work was not copyrighted, or the author has released the software into the public domain (in countries where this is possible). Since public-domain software lacks copyright protection, it may be freely incorporated into any work, whether proprietary or free.
  • Permissive licenses, also called BSD-style because they are applied to much of the software distributed with the BSD operating systems: these licenses are also known as copyfree, as they place no restrictions on distribution. The author retains copyright solely to disclaim warranty and require proper attribution of modified works, and permits redistribution and any modification, even closed-source ones.
  • Copyleft licenses, with the GNU General Public License being the most prominent: the author retains copyright and permits redistribution under the restriction that all such redistribution is licensed under the same license. Additions and modifications by others must also be licensed under the same "copyleft" license whenever they are distributed with part of the original licensed product. This is also known as a viral license. Because of this restriction on distribution, not everyone considers this type of license to be free.

Some well-known examples are the Linux kernel, the BSD and Linux operating systems, the GNU Compiler Collection and C library, the MySQL relational database, and the Apache web server. Other influential examples include the Emacs text editor, the GIMP raster drawing and image editor, the LibreOffice office suite, and the TeX and LaTeX typesetting systems.


Usability Testing

Usability testing is a technique used in user-centered interaction design to evaluate a product by testing it on users. This can be seen as an irreplaceable usability practice, since it gives direct input on how real users use the system. Usability testing focuses on measuring a human-made product’s capacity to meet its intended purpose. Examples of products that commonly benefit from usability testing are foods, consumer products, websites and web applications, computer interfaces, documents, and devices.

In software, usability means the software’s capability to be learned and understood easily, and how attractive it looks to the end user. Usability testing is a black-box testing technique. It tests the following features of the software:

  • how easy it is to use the software
  • how easy it is to learn the software
  • how convenient is the software to the end user

In usability testing, a small set of target end users of a software system use it to expose usability defects. This testing mainly focuses on the users’ ease of use of the application, flexibility in handling controls, and the ability of the system to meet its objectives.

Usability testing determines whether an application is :

  • Useful
  • Findable
  • Accessible
  • Usable
  • Desirable

Goals of usability testing :

1.  Effectiveness of the system

  • Is the system easy to learn?
  • Is the system useful, and does it add value for the target audience?
  • Are the content, colors, icons, and images used aesthetically pleasing?

2. Efficiency

  • Navigation required to reach the desired screen/webpage should be minimal. Scroll bars shouldn’t need to be used frequently.
  • Uniformity in the format of screen/pages in your application/website.
  • Provision to search within your software application or website.

3. Accuracy

  • No outdated or incorrect data like contact information/address should be present.
  • No broken links should be present.

 4. User Friendliness

  • Controls used should be self-explanatory and must not require training to operate.
  • Help should be provided for the users to understand the application/website.

Alignment with the above goals helps in effective usability testing.

Methods of usability testing :

Laboratory Usability Testing: This testing is conducted in a separate lab room in the presence of observers. The testers are assigned tasks to execute. The role of the observer is to monitor the behavior of the testers and report the outcome of the testing. The observer remains silent during the course of testing. In this setting, both observers and testers are present in the same physical location.

Remote Usability Testing: In this testing, observers and testers are remotely located. Testers access the system under test remotely and perform the assigned tasks. The tester’s voice, screen activity, and facial expressions are recorded by automated software. Observers analyze this data and report the findings of the test.

Object-Oriented Programming Paradigm

The terms “objects” and “oriented”, in something like the modern sense of object-oriented programming, seem to have made their first appearance at MIT in the late 1950s and early 1960s. Objects as a formal concept in programming were introduced in the 1960s in Simula 67, a programming language designed for discrete event simulation, created by Ole-Johan Dahl and Kristen Nygaard of the Norwegian Computing Center in Oslo. Object-oriented programming developed into the dominant programming methodology in the early and mid 1990s, when programming languages supporting the techniques became widely available.

Object-oriented programming (OOP) is a programming paradigm that represents concepts as objects that have data fields and associated procedures known as methods. Objects, which are usually instances of classes, interact with one another to design applications and programs.

Here each object is capable of receiving messages, processing data, and sending messages to other objects. Each object can be viewed as an independent machine with a distinct role or responsibility. The methods on these objects are closely associated with the object.

The object-oriented approach encourages the programmer to place data where it is not directly accessible by the rest of the program. Instead, the data is accessed by calling specially written functions, commonly called methods, which are bundled in with the data. The programming construct that combines data with a set of methods for accessing and managing those data is called an object.
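The idea of bundling data with the methods that manage it can be sketched in Python. The `BankAccount` class and the amounts here are invented for illustration, not taken from the text:

```python
# A minimal sketch of an object: data (the balance) bundled with the
# methods that access and manage it.

class BankAccount:
    def __init__(self, balance=0):
        self._balance = balance   # data field kept behind the methods

    def deposit(self, amount):    # method bundled with the data
        self._balance += amount

    def balance(self):            # access goes through a method call
        return self._balance

acct = BankAccount()
acct.deposit(50)
print(acct.balance())   # 50
```

The rest of the program never touches `_balance` directly; it only calls the object’s methods.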

Fundamental concepts and features :

1) Dynamic dispatch – When a method is invoked on an object, the object itself determines what code gets executed by looking up the method at run time in a table associated with the object. This feature distinguishes an object from an abstract data type (or module), which has a fixed (static) implementation of the operations for all instances. It is a programming methodology that gives modular component development while at the same time being very efficient.
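Dynamic dispatch can be sketched with a hypothetical `Shape` hierarchy (the class names are invented for illustration): the same `s.area()` call executes different code depending on the run-time class of `s`.

```python
import math

class Shape:
    def area(self):
        raise NotImplementedError

class Square(Shape):
    def __init__(self, side):
        self.side = side
    def area(self):
        return self.side ** 2

class Circle(Shape):
    def __init__(self, radius):
        self.radius = radius
    def area(self):
        return math.pi * self.radius ** 2

# the call site is identical; the method is looked up on each object
for s in [Square(2), Circle(1)]:
    print(s.area())
```

The loop body is written once, yet each iteration dispatches to the implementation belonging to the object’s own class.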

2) Encapsulation – Encapsulation means that the internal representation of an object is generally hidden from view outside of the object’s definition. Typically, only the object’s own methods can directly inspect or manipulate its fields. Hiding the internals of the object protects its integrity by preventing users from setting the internal data of the component into an invalid or inconsistent state. A benefit of encapsulation is that it can reduce system complexity, and thus increases robustness, by allowing the developer to limit the inter-dependencies between software components.
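A small sketch of encapsulation, using an invented `Thermostat` class: the internal field is hidden (Python mangles double-underscore names), and the setter guards the object against invalid state.

```python
class Thermostat:
    def __init__(self):
        self.__temp = 20          # internal representation, hidden from outside

    def set_temp(self, value):
        if not 5 <= value <= 35:  # the method protects the object's invariant
            raise ValueError("temperature out of range")
        self.__temp = value

    def get_temp(self):
        return self.__temp

t = Thermostat()
t.set_temp(25)
print(t.get_temp())   # 25
```

Because callers must go through `set_temp`, the object can never be put into an out-of-range state from outside.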

3) Inheritance – It is a way to reuse code of existing objects, or to establish a subtype from an existing object, or both, depending upon programming language support.
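A sketch of both uses of inheritance, with invented class names: `Bicycle` purely reuses `Vehicle`’s code, while `Car` also specializes it by overriding `describe()`.

```python
class Vehicle:
    def __init__(self, wheels):
        self.wheels = wheels

    def describe(self):
        return f"vehicle with {self.wheels} wheels"

class Bicycle(Vehicle):           # pure code reuse
    def __init__(self):
        super().__init__(wheels=2)

class Car(Vehicle):               # reuse plus specialization
    def __init__(self):
        super().__init__(wheels=4)

    def describe(self):
        return "car: " + super().describe()

print(Bicycle().describe())   # vehicle with 2 wheels
print(Car().describe())       # car: vehicle with 4 wheels
```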

4) Decoupling – Decoupling refers to careful controls that separate code modules from particular use cases, which increases code reusability. A common way to achieve decoupling in OOP is to have client code depend on an abstract, polymorphic interface rather than on a concrete, encapsulated implementation.

5) Abstraction – Abstraction captures only those details about an object that are relevant to the current perspective. Abstraction can apply to control or to data: Control abstraction is the abstraction of actions while data abstraction is that of data structures.
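Abstraction (and the decoupling it enables) can be sketched with Python’s `abc` module. The `Storage` interface and `MemoryStorage` implementation are invented for illustration: the client function depends only on the abstract operations, not on any concrete class.

```python
from abc import ABC, abstractmethod

class Storage(ABC):               # abstract interface: only the relevant details
    @abstractmethod
    def save(self, key, value): ...

    @abstractmethod
    def load(self, key): ...

class MemoryStorage(Storage):     # one concrete implementation
    def __init__(self):
        self._data = {}

    def save(self, key, value):
        self._data[key] = value

    def load(self, key):
        return self._data.get(key)

def run_app(store):               # client code is decoupled from the concrete class
    store.save("greeting", "hello")
    return store.load("greeting")

print(run_app(MemoryStorage()))   # hello
```

Swapping in a different `Storage` subclass (say, one backed by a file) would require no change to `run_app`.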

Other OOP concepts include – Classes, instances, methods etc.

OOP languages

1) Pure OO languages: everything in them is treated consistently as an object, from primitives such as characters and punctuation all the way up to whole classes, prototypes, blocks, modules, etc. They were designed specifically to facilitate, even enforce, OO methods. Examples: Eiffel, Emerald, Ruby, Scala, Smalltalk.

2) Languages designed mainly for OO programming, but with some procedural elements. Examples: C++, Java, C#, VB.NET, Python.

3) Languages that are historically procedural, but have been extended with some OO features. Examples: Visual Basic, Fortran, Perl, COBOL, PHP.

4) Languages with most of the features of objects (classes, methods, inheritance, reusability), but in a distinctly original form. Examples: Oberon and Common Lisp.

Lab Evaluation 5 – Pooja Pande

Short executive note describing the software for lay users.

https://www.box.com/s/sd6icu9innuqilw6gopf

Lab Evaluation 4 – Pooja Pande

1. The clauses of the ISO 9001 and ISO 9126 standards applicable for the case – From ISO 9126, the following features are added –

Reliability
- Fault tolerance
- Recoverability

Usability
- Understandability
- Learnability
- Operability

Maintainability
- Testability
- Stability
- Changeability

Portability
- Adaptability
- Installability
- Conformance
- Replaceability

From ISO 9001, the following clauses are added –
1. Documentation Requirements
2. Management Responsibility
3. Customer Focus
4. Quality Policy
5. Planning
6. Resource Management
7. Validation of Processes for Production and Service Provision
8. Identification & Traceability
9. Measurement, Analysis & Improvement
10. Customer Satisfaction
11. Internal Audit
12. Monitoring and Measurement of Processes
13. Monitoring and Measurement of Product
14. Control of Nonconforming Product

2. Software configuration management process used for build and process management
The traditional SCM process is looked upon as the best-fit solution to handling changes in software projects. It identifies the functional and physical attributes of software at various points in time, and performs systematic control of changes to the identified attributes in order to maintain software integrity and traceability throughout the software development life cycle.

The SCM process further defines the need to trace the changes and the ability to verify that the final delivered software has all the planned enhancements that are supposed to be part of the release.

Traditional SCM identifies four procedures that must be defined for each software project to ensure a good SCM process is implemented. They are:

1. Configuration Identification

Software is usually made up of several programs. Each program, its related documentation, and data can be called a “configurable item” (CI). The number of CIs in any software project, and the grouping of artifacts that make up a CI, is a decision made by the project. The end product is made up of a collection of CIs. The status of the CIs at a given point in time is called a baseline. The baseline serves as a reference point in the software development life cycle. Each new baseline is the sum total of an older baseline plus a series of approved changes made to the CIs.

A baseline is considered to have the following attributes

Functionally complete
A baseline will have a defined functionality. The features and functions of this particular baseline will be documented and available for reference. Thus the capabilities of the software at a particular baseline are well known.

Known Quality
The quality of a baseline will be well defined, i.e. all known bugs will be documented, and the software will have undergone a complete round of testing before being designated as the baseline.

Immutable and completely recreatable
A baseline, once defined, cannot be changed. The list of the CIs and their versions are set in stone. Also, all the CIs will be under version control so the baseline can be recreated at any point in time.
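The immutability and recreatability of a baseline can be sketched as a toy model in Python (the file names, versions, and `make_baseline` helper are invented for illustration): a baseline is a frozen mapping from configurable items to their versions, and a new baseline is built from an old one plus approved changes.

```python
from types import MappingProxyType

def make_baseline(ci_versions):
    # freeze the CI -> version mapping so the baseline cannot be altered
    return MappingProxyType(dict(ci_versions))

baseline1 = make_baseline({"parser.c": "1.0", "manual.doc": "1.0"})

# a new baseline is the old baseline plus a series of approved changes
approved_changes = {"parser.c": "1.1"}
baseline2 = make_baseline({**baseline1, **approved_changes})

print(baseline2["parser.c"])   # 1.1
print(baseline1["parser.c"])   # 1.0 -- the old baseline is unchanged
```

Attempting to assign into either baseline raises an error, mirroring the “set in stone” property; in practice this role is played by a version control system rather than in-memory objects.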

2. Configuration Control
The process of deciding and coordinating the approved changes for the proposed CIs, and implementing those changes on the appropriate baseline, is called configuration control.
It should be kept in mind that configuration control only addresses the process after changes are approved. The act of evaluating and approving changes to software comes under the purview of an entirely different process called change control.

3. Configuration Status Accounting
Configuration status accounting is the bookkeeping process for each release. This procedure involves tracking what is in each version of the software and the changes that led to that version.
Configuration status accounting keeps a record of all the changes made to the previous baseline to reach the new baseline.

4. Configuration Authentication
Configuration authentication (CA) is the process of assuring that the new baseline has all the planned and approved changes incorporated. The process involves verifying that all the functional aspects of the software are complete, and also that the delivery is complete in terms of the right programs, documentation, and data being delivered.
The configuration authentication is an audit performed on the delivery before it is opened to the entire world.

3. Features to be tested –

1. Reliability – Software reliability is the probability that the software will work properly in a specified environment and for a given time.

Probability = Number of cases when we find failure / Total number of cases under consideration
-Test cases and test procedure for each software module
-Time constraints are handled by applying fix dates or deadlines to the tests to be performed
-Mean Time To Failure (MTTF), Mean Time Between Failures (MTBF), and Mean Time To Repair (MTTR) are calculated.
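The probability formula and the three time-based metrics above can be illustrated with a short calculation. The failure counts and times below are made-up sample data, not taken from the text; MTBF is computed here under the common convention MTBF = MTTF + MTTR.

```python
failure_times = [100.0, 120.0, 110.0]   # hours of operation before each failure
repair_times = [2.0, 3.0, 1.0]          # hours spent repairing each failure

mttf = sum(failure_times) / len(failure_times)   # Mean Time To Failure
mttr = sum(repair_times) / len(repair_times)     # Mean Time To Repair
mtbf = mttf + mttr                               # Mean Time Between Failures

failures, total_cases = 3, 1000
probability = failures / total_cases             # failure probability as defined above

print(mttf, mttr, mtbf, probability)   # 110.0 2.0 112.0 0.003
```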

2. Compatibility – Here the non-functional parts of the software are tested:
-Computing capacity of Hardware Platform
-Operating systems; Linux, Windows, etc.
-Other System Software (Web server, networking/ messaging tool, etc.)
-Browser compatibility (Chrome, Firefox, Netscape, Internet Explorer, Safari, etc.)

3. Usability – For usability testing, people of various age groups are asked to access the web framework, and the response in mainly four areas is measured –
-Efficiency — How much time, and how many steps, are required for people to complete basic tasks? (For example, find something to buy, create a new account, and order the item.)
-Accuracy — How many mistakes did people make?
-Recall — How much does the person remember afterwards or after periods of non-use?
-Emotional response — How does the person feel about the tasks completed? Is the person confident, stressed? Would the user recommend this system to a friend?

Universal Design

Ronald Mace coined the term “universal design” to describe the concept of designing all products and the built environment to be aesthetic and usable to the greatest extent possible by everyone, regardless of their age, ability, or status in life.

For example, one person could be six feet tall, an excellent reader, primarily a visual learner, and deaf. All of these characteristics, including the deafness, should be considered when developing a product.

Making a product or an environment accessible to people with disabilities often benefits others. For example, automatic door openers benefit individuals using walkers and wheelchairs, but they also benefit people carrying groceries or holding babies, as well as elderly citizens.

Barriers in standard computer software limit opportunities in education and employment for some people with disabilities. For example, a part of a multimedia tutorial that uses voice narration without captioning or transcription is inaccessible to students who are deaf. Likewise, a software program that requires an unnecessarily high level of reading may be inaccessible to some people who have learning disabilities.

Universal design can be applied to software in the following ways –

  1. Keyboard access to software – This makes the software accessible to those who cannot use a mouse or other pointing device.

    For example, a person with a disability that affects dexterity may find it impossible to move or hold a pointing device with enough accuracy to activate the desired feature.

  2. When animation is displayed, the information shall also be displayable in at least one non-animated presentation mode. The use of animation on a screen can pose serious access problems for users of screen readers.
  3. Color coding shall not be used as the only means of conveying information, indicating an action, prompting a response, or distinguishing a visual element.
  4. Adjustable color and contrast settings with a range of contrast levels shall be provided.

    Many people experience a high degree of sensitivity to bright displays. People with this condition cannot focus on a bright screen for long, because they soon become unable to distinguish individual letters.

    On the other hand, many people with low vision work most efficiently when the screen is set to very sharp contrast settings.

  5. Software shall not use flashing or blinking text and objects. Some individuals with photosensitive epilepsy can have a seizure triggered by displays that flicker or flash. When software is designed to be accessible to individuals with a broad range of disabilities, it becomes more usable by others as well.

    For example, providing text captions for a multimedia presentation with speech output can provide access to the content both for a user who is deaf and for one for whom English is a second language.


CMMI – DEV V1.2

As some of the recently posted blogs have talked about CMMI, we will directly be going into the details of CMMI V1.2 for development.

CMMI-DEV is basically a collection of practices that, when implemented, improve product quality as well as project and organizational performance; in other words, the model is a tool that helps organizations improve their ability to develop and maintain quality products and services. CMMI-DEV organizes practices into 22 “process areas”, along with 13 practices called “generic practices”. Based on their business needs and priorities, organizations select one or more process areas and sets of generic practices to implement.
A new version of CMMI is released every 3-5 years. The process includes receiving change requests from users of the previous version; these requests are reviewed by a team of members from industry, government, and the SEI to determine the changes to make to the process areas, practices, and their descriptions.
The attached figure gives an overview of the process areas of CMMI-DEV and shows which maturity level and category each belongs to in CMMI.

[Figure: CMMI-DEV process areas by maturity level and category]

1. Requirements Management (REQM): Manage all requirements received or generated by the project, including both technical and nontechnical requirements, as well as requirements imposed on the project by the organization.

2. Project Planning (PP): This area involves the following activities:

  • development of the project plan.
  • interacting with the stakeholders.
  • maintaining the developed plan.

3. Project Monitoring and Control (PMC): In this process area we track the project’s progress so that appropriate corrective steps can be taken when the project’s performance deviates from the plan developed above.

4. Supplier Agreement Management (SAM): This manages the acquisition of products and services from suppliers. A supplier agreement, which is any written agreement between the organization (representing the project) and the supplier, is established to manage the relationship between them. The agreement can be a contract, license, service-level agreement, or memorandum of agreement.

5. Quantitative Project Management (QPM): This helps in achieving the project’s established quality and process performance objectives.

6. Integrated Project Management (IPM): This establishes the project’s defined process at project startup and manages the project using that defined process. It also addresses the coordination of all activities associated with the project.

7. Risk Management (RSKM): As we all know, risk management is a continuous process and an important part of project management. It involves identifying potential risks before they occur, so that predefined steps can be taken as and when needed throughout the life of the project.

8. Measurement and Analysis (MA) : This involves-

  • specifying objectives of measurement and analysis.
  • specifying measures, analysis techniques, and mechanisms for data collection, storage, reporting, and feedback.
  • implementing the techniques specified above.

9. Process and Product Quality Assurance (PPQA): This involves-

  • evaluating the performed processes and work products against standards.
  • identifying and documenting the noncompliance issues.
  • giving feedback to project staff and managers.
  • ensuring that noncompliance issues are addressed.

10. Configuration Management (CM): This serves the purpose of establishing and maintaining the integrity of work products using configuration identification, configuration control, configuration status accounting, and configuration audits.

11. Decision Analysis and Resolution (DAR): The purpose of this process area is to analyze possible decisions using a formal evaluation process that evaluates identified alternatives against established criteria. This involves establishing guidelines to determine which issues should be subject to a formal evaluation process and applying formal evaluation processes to these issues.

12. Causal Analysis and Resolution (CAR): The purpose of CAR is to identify causes of selected outcomes and take action to improve process performance, which in turn improves quality and productivity by preventing the introduction of defects or problems, and by identifying and appropriately incorporating the causes of superior process performance.

13. Requirements Development (RD): The purpose of this process area is to establish requirements, which includes customer requirements, product requirements, and product component requirements.

14. Technical Solution (TS): This process area is applicable at any level of the product architecture. The purpose is to select, design, and implement solutions to requirements.

15. Product Integration (PI): This process area deals with the integration of components into more complex components or even the complete product. It also ensures that the product behaves properly after integration, and then delivers the product.

16. Validation (VAL): The term validation means testing whether something is true, and it is used in the same sense here. The purpose of validation is to check that the product fulfills its intended use when placed in the environment where it is to be used.

17.Verification (VER): This process area involves the verification of preparation and performance. The purpose is to ensure that selected work products meet their specified requirements.

18. Organizational Process Focus (OPF): This process area involves the thorough study and understanding of the organization’s processes and process assets, in order to identify their strengths and weaknesses and make improvements.

19. Organizational Process Definition (OPD): This process area enables consistent process execution across the organization and provides long-term benefits to it. The purpose is to establish and maintain a usable set of organizational process assets, work environment standards, and rules and guidelines for teams.

20. Organizational Training (OT): This deals with the training provided in the organization to develop the skills and knowledge of people so they can perform their roles effectively and efficiently.

21.Organizational Process Performance (OPP) : The purpose is to establish and maintain a quantitative understanding of the performance of selected processes in the organization’s set of standard processes in support of achieving quality and process performance objectives, and to provide process performance data, baselines, and models to quantitatively manage the organization’s projects.

22. Organizational Performance Management (OPM): This involves iteratively analyzing aggregated project data, identifying gaps in performance against the business objectives, and selecting and deploying improvements to close those gaps so that the organization meets its business objectives.

The following link gives a detailed description of the above-mentioned process areas – http://www.cmmi.de/#el=CMMI-DEV/0/HEAD/folder/folder.CMMI-DEV