Dilruba Malik: Quality is not about finding defects
Video Transcription
So I wanted to thank the WomenTech Network for giving me the opportunity to connect with all of you. It is a way to bond beyond boundaries: I can see and talk to all of you during this pandemic, while we are all staying at home and dealing with our personal challenges. It is a good opportunity for us to connect with each other and share our thoughts and knowledge. Today I am going to talk about quality, and specifically why quality is not about finding defects. How did I come up with this title? During a conversation with my husband and my kids, they were doing something and I kept finding issues, and my husband said, "Oh, you work in QA, so you always find issues." That struck me: people think quality is just finding bugs, finding defects. For me it is not, and that is why I chose this title: quality is not about finding defects. Let me set the background. I come from the QA profession, with more than 15 years of QA engineering experience, and I currently lead a QA team at Palo Alto Networks. So, what is QA, and what is quality?
When you Google it, you will find the dictionary meaning of quality: the standard of something, software or hardware, as measured against its specification, and the degree of excellence. What does that mean? It means you have certain expectations, and when you look at the product you should get the expected behavior or the expected result. I often get the question: is QA alone responsible for quality? I believe quality is everybody's responsibility. Developers do their part by writing their unit tests. Product owners check which features go into a release and what the priorities are, so they are doing their part. UX designers care about the user experience and the UI design. Security testers check for vulnerabilities. Performance testers take care of the non-functional testing, for example stress and load tests. Everybody contributes to the overall product quality, so in order to get a quality product, everybody needs to contribute. I do not claim that QA is the sole owner of quality; that is why I feel quality is everybody's responsibility.
All of us contribute our share to the product. So what is the role of QA? I want to give a generic idea. Quality assurance is defined as an activity to ensure that the organization is providing the best possible product or service to customers. QA focuses on improving processes and delivering quality products to the customer. We are the middlemen between development, the release, and the end customer, so to speak. Our job is to identify when something is not working as per the specification, and also to set a standard so we understand what is going on: if development claims a feature is working, what exactly is working, and to what extent? Feature-wise, maybe it is working.
But for QA, our responsibility is to verify that the feature works on its own and also works when we integrate it with other parts of the product. The handshake between components should work too, and we try to figure out the boundary cases, the rare instances, and the negative test cases. So what does QA do day to day? We understand the product. To be good at QA, we have to understand the technical architecture: how each of the pieces is glued together, and specifically how data flows. What is the customer seeing in the UI, how does the data get to the UI, and how does the backend work? Understanding the technical architecture gives us that view of how all the pieces are connected. Then, of course, we validate the functional specification; we have to do the functional testing. Identifying defects is certainly one of the tasks: we check whether a given specification meets its goal, and if it does not, it is an issue, so we raise it and work with the developer to solve it. And automation: nowadays I do not see any purely manual QA role at all; it is mostly a combined role.
We have to do manual QA as well as automation, so that we can catch the issue, catch the defect, before our customer does, and to do that we have to invest in automation. I want to talk about the QA mindset, because the QA mindset is about viewing the product holistically. Testing is really verifying, exploring, and automating; that is what makes good testing. There are a lot of other things we have to do. For example, input validation: when end users log into our product, we cannot control what they put into the login and password fields, so to be safe we have to make sure we are doing input validation. Then there is boundary testing: if the specification for a feature says the valid range is 1 to 10, we have to make sure that 11 does not work and negative one does not work. And we have to do functional testing for every feature, along the lines of the sketch below.
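As a quick illustration of input validation and boundary testing, here is a minimal sketch using pytest. The `validate_quantity` function and the 1-to-10 range are hypothetical, just to mirror the example above; the point is that the tests exercise the edges of the range (1 and 10) and the values just outside it (0, 11, -1).

```python
import pytest

def validate_quantity(value: int) -> bool:
    """Hypothetical input validation for a field specified as 1..10."""
    return isinstance(value, int) and 1 <= value <= 10

# Boundary and negative cases: the edges of the range plus values just outside it.
@pytest.mark.parametrize("value,expected", [
    (1, True),    # lower boundary
    (10, True),   # upper boundary
    (0, False),   # just below the range
    (11, False),  # just above the range
    (-1, False),  # negative input
])
def test_quantity_boundaries(value, expected):
    assert validate_quantity(value) is expected
```

Running `pytest` on this file checks all five cases at once; the same pattern extends to login and password fields, where the inputs we cannot control are exactly the ones worth testing.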
We also have to understand multi-browser testing: is the product capable of running in Chrome or Safari, or is it specified that we will support only one browser, or any browser? Backward compatibility is also our task, and that is where regression testing comes in: a regression issue is when something that worked in the last build is not working now. We have to keep track of how the product is progressing and make sure we cover backward compatibility. Then there is non-functional testing, such as stress testing, load testing, and performance testing; we have to think about that when we are testing a feature. As for best practices for QA, here is what has worked for me so far. First, establish an efficient process. We have to understand how we want to work day to day and what our handshake is. For example, developers say they are done with their feature development, but when QA goes to verify it, it is not available: the build is not deployed, or it is in the development environment and not in staging. So how does that handshake happen?
We have to establish an efficient process so that we understand the handshake and the expectations of QA as well as of the other teams. In this example I talked about development, but there are many other dependencies. For example, the requirements: if they are not set properly and clearly, then we do not know what we should verify against. So establishing an efficient process is very important for QA. Second, a robust testing environment. We need a testing environment whose data is very close to customer data. It is usually not possible to have actual customer data in a testing environment, but we want to simulate data so that we can test as close to the customer environment as possible. Third, release acceptance criteria: what will we accept for this particular release, and for the overall release process? If we are not allowing any P1 or P2 bugs, then we have to say loud and clear that we are not accepting any P1 or P2 bugs in the release, and also define acceptance criteria for anything else.
For example, if there is a P2 bug we are not fixing for some reason, but we have to release because a customer is waiting for it or there is some urgent scenario, then we have to make sure we spell that out in the release notes. Those criteria have to be acknowledged by the product team, by the development team, and of course by the QA team. Fourth, bug prioritization: how do we prioritize bugs? Should a customer-found bug always be a P1? We want to fix all customer issues as soon as possible, but development has a large backlog; maybe feature development is in progress and a lot of new infrastructure-related issues are coming in. So it is not always possible to prioritize every customer issue as P1. In that case we figure out which are the major issues. There will be P1 customer-found bugs that we should fix as soon as possible, most probably within 24 hours, though it depends on how each company sets the rule. P2 and P3 we take care of after that, but customer bugs are the first priority, and then come the internally found bugs for our own features or for cross-functional, integration-related issues.
Those bugs have to be prioritized, say on a weekly basis, to confirm things are progressing properly. Then functional, security, and performance testing should be another priority, to make sure our test cases are securing our product and that the security vulnerability scan does not flag anything for our particular product. And automation is the key. For a more efficient process, as I was mentioning: establish acceptance and exit criteria, always keep your test cases up to date (that is another practice we need to pay close attention to), stay on top of customer issues, and transform your customer issues into test cases.
We may solve one customer issue, but if we do not keep testing it, if we do not cover it in our test cases, it will show up again with a different customer. So we have to have a grip on that and make sure we transform our customer issues into test cases. Then there is the story, task, and bug relationship. Every story will have multiple tasks, and multiple tasks might introduce multiple bugs. If we do not link the task and the bug, if there is no relationship, everything is lost: after six months we do not know which feature we filed the bug about. So we have to create a process for the bug and story relationship. It could be through a component, or through a "relates to" link, depending on whatever tool people are using. I am using Jira, and Jira has very good support for those relationships. Then, templates for everyday tasks, for example bug filing: every QA engineer will file a bug differently, but if we have a template that each QA engineer can follow, then whoever files a bug on your team, it will look the same, because we are following the template. The same goes for test case writing, status updates, and so on; a sketch of a templated bug report follows below.
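As one way to make every bug look the same, the template can even live in code and be filed through Jira's REST API. This is a minimal sketch, assuming a hypothetical Jira site, project key, and credentials; the exact fields and priority names depend on your Jira configuration.

```python
import requests

JIRA_URL = "https://your-domain.atlassian.net"   # hypothetical Jira site
AUTH = ("qa-bot@example.com", "api-token")        # hypothetical credentials

def file_bug(summary: str, steps: str, expected: str, actual: str,
             build: str, priority: str = "P2") -> dict:
    """File a bug whose description always follows the same template."""
    description = (
        f"*Build:* {build}\n"
        f"*Steps to reproduce:*\n{steps}\n"
        f"*Expected result:* {expected}\n"
        f"*Actual result:* {actual}\n"
    )
    payload = {
        "fields": {
            "project": {"key": "QA"},            # hypothetical project key
            "issuetype": {"name": "Bug"},
            "summary": summary,
            "description": description,
            "priority": {"name": priority},      # priority names vary per Jira setup
        }
    }
    resp = requests.post(f"{JIRA_URL}/rest/api/2/issue", json=payload, auth=AUTH)
    resp.raise_for_status()
    return resp.json()
```

Whether bugs are filed through the API or by hand, the value is in the fixed sections (build, steps, expected, actual): six months later the report is still readable and still linked back to its story.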
As I was mentioning, for a robust testing environment we should always aim for three different environments. There is the developer environment, where developers do the development work, and the staging environment, where QA tests. After QA signs off on staging, the build goes to pre-production, which might have data much closer to real customer data, and we check the product there. Once we are satisfied and QA gives the green light, it can go to production. On-demand test environment deployment is another important thing for a QA environment. For example, once we release a product, customers will not necessarily file bugs against that latest release; a customer may find an issue in an earlier build. So we have to be able to deploy any build on demand in the test environment. Most companies already have this, but if it is not there, it is time to get that robust testing environment ready.
Also make sure your testing environment is under a CI/CD pipeline, so you do not have to go through manual steps to deploy a new build or an old build and waste a lot of time. If you are still deploying manually, it is worth investing some effort into understanding how quickly you can bring it under a CI/CD process, so your test team becomes much more efficient. QA should always test on staging with near-real data, as I was mentioning; most companies cannot put confidential customer data into the staging environment, so we should have near-real, customer-like data. On release acceptance criteria, again: P1 and P2 bugs should be taken care of before release, and only testable features should be delivered. We do not want to deliver part of a big feature; if we do not know how to test it, it can create a lot more regression later. And known issues should go in the release notes. For example, if there is a P2 bug we could not fix because of the time crunch, the backlog, tech debt, or whatever the reason is,
then we have to call those unfixed P1 or P2 issues out in the release notes. Tech debt should be taken care of frequently. Most of the time we put a lot of things on hold because we are working on a feature release. For example, I need to write documentation on how to set up an environment, and I know it is essential, but since I am busy and my release is on Friday, I do not have time, so I park that task and go with the release. After that I might forget, so I have to keep track of what technical gaps I have. Documentation is one; automation is the other thing we always push out because we do not have time. We have to take care of these things whenever we have some time; for example, if we release every two weeks, keep a buffer of time so we can go over our tech debt and whatever pending tasks we have. Next, a new feature should not introduce regression.
That is one of the things we always have to take care of: the new feature works as per the specification, and the other components are not broken by the new feature, so there should be no regression defects. On bug prioritization, bug triaging should be a regular practice, with a common understanding of what should be P1, what should be P2, and so on. Most companies I have seen use P1 to P4, but it depends on your company's standard for tracking severity or priority. Customer-found issues should be the first priority, for sure. Then functional, security, and performance testing. QA does functional testing most of the time as a day-to-day activity, but security test cases should always be on QA's mind. Maybe a separate security team takes care of certain things, but at the very minimum we should cover the basics. Input validation is one; the password is another: it should not be possible to use an overly simple password, so do we have password restrictions in place? These kinds of security test cases we should always verify from the QA side, as in the sketch below.
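As a small illustration of that kind of basic security check, here is a sketch of tests for a password policy. The `is_password_acceptable` function and the specific rules (minimum length, at least one digit) are hypothetical; the idea is simply that the weak passwords a user might try are written down as test cases.

```python
import re

def is_password_acceptable(password: str) -> bool:
    """Hypothetical password policy: at least 8 characters and at least one digit."""
    return len(password) >= 8 and re.search(r"\d", password) is not None

def test_rejects_trivially_simple_passwords():
    for weak in ["password", "12345", "qwerty", "abc"]:
        assert not is_password_acceptable(weak)

def test_accepts_password_meeting_the_policy():
    assert is_password_acceptable("Str0ngEnough")
```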
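Then there is the performance of the application: a production threshold should be identified as a baseline and continually optimized. How is my application doing as of today? Unless we are doing performance testing, we will not have that baseline, and if we do not have a baseline, how can we optimize? This should be a common practice, and QA should always think about it; maybe at an earlier stage of the product it is not possible, but think about it in a forward-looking way and have a performance test plan in mind, something like the sketch below.

Here is a minimal sketch of what a baseline check can look like, using the `requests` library against a hypothetical endpoint and a hypothetical 500 ms threshold. A real performance test would use a dedicated tool such as JMeter and far more load; this only shows the idea of recording a threshold and failing when it regresses.

```python
import time
import requests

BASELINE_SECONDS = 0.5                      # hypothetical agreed-upon threshold
URL = "https://staging.example.com/health"  # hypothetical staging endpoint

def test_response_time_within_baseline():
    """Fail the build if a single request exceeds the recorded baseline."""
    start = time.monotonic()
    response = requests.get(URL, timeout=5)
    elapsed = time.monotonic() - start

    assert response.status_code == 200
    assert elapsed <= BASELINE_SECONDS, (
        f"Response took {elapsed:.3f}s, baseline is {BASELINE_SECONDS}s"
    )
```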
Automation makes QA's life easier: instead of doing the same manual testing again and again, for the UI or for the backend, having automation takes care of it. How do we decide what to automate and what not to automate? Basically, any test case you find yourself running every release should definitely be automated, so that you do not have to monitor or execute those test cases by hand every day. Then you can concentrate on the bigger issues and let automation handle the time-consuming test cases. You also need to understand when to automate. Sometimes we do not have time during the release, so we think about automating when things are slower. But it is good practice to do some percentage of automation every week, so that the backlog of things to automate does not grow so big that you pay a much higher price later; one way to organize this is sketched below.
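One simple way to organize this, as a sketch: tag the checks that must run every release with a pytest marker, and grow that tagged suite a little each week. The marker name and the placeholder tests are hypothetical.

```python
# pytest.ini (or setup.cfg) would register the custom marker, e.g.:
# [pytest]
# markers = regression: checks that must run on every release

import pytest

@pytest.mark.regression
def test_login_page_loads():
    # A check we used to repeat manually before every release.
    ...

@pytest.mark.regression
def test_existing_report_still_renders():
    # Guards against the "worked in the last build, broken now" case.
    ...

def test_new_feature_exploratory():
    # Newer, still-changing checks stay out of the release gate for now.
    ...
```

Running `pytest -m regression` before each release then executes exactly the repeated checks, while plain `pytest` still runs everything.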
So that covers the QA mindset: what QA does day to day and how to think about testing a product. Now let us shift a little to methodology. So far I have worked with the waterfall model, and right now I am working in an agile model, but it does not matter which model we are working in. I hope you already know what the waterfall model is; it is basically a more linear process than the agile model everybody uses now. We do the requirements gathering, then we design the technical architecture and the product, and then come development, test, deployment, and maintenance. The whole process takes quite a long time. I can talk about my experience at Cisco: there I saw this process take six to twelve months, so our release was every six months, and some teams were trying to cut it down to a three-month release process. In the agile model, if you look at it, we are doing a similar thing: we plan, design, develop, test, deploy, and review, exactly the same steps, but much more frequently.
For example, if you are following two-week sprints, then in two weeks we do all of this, deliver that piece of the product, that particular bit of functionality, and then move on to the same cycle again. What this does is basically a quick iteration of what we were doing in waterfall. If you think about it, it is pretty much the same thing; you are just getting a much faster deployment of some part of the product, delivered every two weeks, and you keep doing it continuously. So QA activity happens very fast in an agile environment, and we see a lot more testing pressure. For example, ten developers are developing ten different features, and the QA-to-developer ratio is definitely not always one to one; it is closer to one QA for every three developers. So we might have three QA engineers, and whatever those ten developers build, we have to test those features and identify the defects within a very short amount of time.
If you have a two-week sprint, then maybe QA gets two days of testing for those ten features. This is just an example, not a hard and fast rule, but when we work that way, keeping track of automation is also very important. Everything happens at the same time, and we have to write the test cases as well: do the manual testing, write the test cases, and then automate them. So it is the same work, just faster. Then there are different kinds of architecture: monolithic architecture and microservice architecture. Most newer companies use a microservice architecture. I would not generalize and say that all the older companies use monoliths, but that was the trend at the time. The way a monolithic architecture works is basically that we have the UI, then the business logic, and then the backend data interface and the database. Testing a monolith and testing microservices is done in a similar way, but there are a lot of pros and cons, so I want to go over the microservice architecture as well.
As you saw with the monolithic architecture, we have the UI, and in a microservice architecture we instead have all the microservices taking care of the different flows, each with its own backend and database. One benefit is that if one of the microservices is not working, we can disable it and create a new microservice to replace it, or update that microservice, and the functionality keeps working. With a monolith, you basically have to apply the fix throughout the application and consider how it interacts with the rest of the components. Comparing monolithic and microservices: hundreds of microservices have to communicate synchronously over the network, making service interconnection a significant issue that does not affect a monolithic architecture; execution time, load time, and how the services are connected are an ongoing challenge for microservices. On the other hand, each microservice is independent, so it can be replaced without affecting the entire system. This gives microservices resiliency and forces developers to design more robust services; a sketch of how QA can exercise that service-to-service handshake follows below.
Concurrency in microservices is handled by scaling. Next I want to go over the types of testing we do. We have functional testing, as I was mentioning, and non-functional testing. Functional testing includes unit testing, integration testing, system testing, acceptance testing, and regression testing, while non-functional testing is basically performance testing, security testing, usability testing, and compatibility testing. So I wanted to go over the overall, generic strategy and how QA thinks as a whole. I hope you can see that QA does not only file bugs; they have a lot of other activities. The takeaway is that the QA mindset is the key: whether we are testing in a waterfall or an agile model, whether we are testing a monolithic or a microservice application, whether we are doing functional or non-functional testing, we have a lot of things to take care of, but the QA mindset is the key.
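Here is a minimal sketch of that kind of integration check between services, using the `requests` library. The service names, endpoints, and response shape are all hypothetical; the point is that QA verifies the handshake over the network, not just each service in isolation.

```python
import requests

ORDERS_URL = "https://staging.example.com/orders"         # hypothetical service
INVENTORY_URL = "https://staging.example.com/inventory"    # hypothetical service

def test_order_service_reflects_inventory():
    """Creating an order in one service should be visible through another."""
    # Hypothetical payloads; a real test would use the services' actual contracts.
    item = {"sku": "ABC-123", "quantity": 1}

    order = requests.post(ORDERS_URL, json=item, timeout=5)
    assert order.status_code == 201

    stock = requests.get(f"{INVENTORY_URL}/ABC-123", timeout=5)
    assert stock.status_code == 200
    assert stock.json()["reserved"] >= 1   # hypothetical response field
```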
In order to be a successful QA engineer, you have to understand the overall, holistic view of the product, not just whether a piece of functionality is working or not. So that is all I have. If you have any questions, I will definitely answer them. I see a question: what is the difference between test cases and test scripts? Very good question. A test case basically captures a scenario: what you want to follow and how you want to test it. For example, a feature might have a hundred test cases describing how an end user works through it end to end. You have to log in, so you provide your user name and your password and then click the login button; that can be one test case. The next test case can be that after you log in, you successfully see the next screen. A test script does pretty much the same thing, but it is automation: what you are testing manually, you automate, and that becomes the test script. A sketch of the difference is below.
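To make the difference concrete, here is a minimal sketch: the comment block is the written test case, and the function below it is the corresponding test script using Selenium with Python. The URL, element ids, and page title are hypothetical.

```python
from selenium import webdriver
from selenium.webdriver.common.by import By

# Test case (written scenario):
#   1. Open the login page.
#   2. Enter user name and password.
#   3. Click the login button.
#   4. Verify the next screen (dashboard) is shown.

# Test script: the same scenario automated with Selenium.
def test_login_shows_dashboard():
    driver = webdriver.Chrome()
    try:
        driver.get("https://staging.example.com/login")              # hypothetical URL
        driver.find_element(By.ID, "username").send_keys("qa-user")  # hypothetical element ids
        driver.find_element(By.ID, "password").send_keys("secret")
        driver.find_element(By.ID, "login-button").click()
        assert "Dashboard" in driver.title                           # hypothetical success check
    finally:
        driver.quit()
```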
Is there any other question? I got another one: which test framework can you suggest? It depends on your needs and your team. First of all, you have to identify the expertise on your team. For example, if you want to do UI automation, which framework do you want to use? Depending on whether your team members have Java expertise or Python expertise, you can use Selenium with Java or Selenium with Python, or with JavaScript for that matter, and the learning curve is pretty low. If you want to go with behavior-driven test cases, you can use something else, such as the Robot Framework. But from my experience, Selenium works universally for every company, whether you are testing a monolith or microservices. Selenium is very helpful for all the UI test cases, and it can go in depth: you can handle tooltips, check the data inside a chart, and so on. So I am a supporter of Selenium, but again, it depends on your team's expertise and how you are working. I got another question, but actually, I also want to mention the automation frameworks we use.
You can use TestNG, or JMeter for performance testing. I just wanted to highlight that this is what we are using; it is not that you have to use the same thing, but you can weigh the pros and cons, match your team's expertise, and go with that particular framework. I got another question asking for tips on the QA test plan. A QA test plan is basically the overall view of what QA is going to do. If we have a release, say one month away, how do you want to design your test plan for testing that feature? You basically have to understand the technical architecture, and then start writing test cases even though the development is not there yet: read through the software specification and derive the test cases. After you go through the test cases, estimate how long it will take for your team, or for you, to test it. Then, what do you see as potential risks, and what dependencies do you see?
If there is a cross-functional dependency, you have to understand it and state it in the test plan: we can do this functional and non-functional testing, this is our timeline, this is our ETA, we have this many test cases, but we have a dependency on this particular piece of infrastructure or this particular piece of software that we have to get from the XYZ team.
How we will integrate that is also laid out in the test plan. Is there any other question? I think that is all the questions I had. For the test plan, I also forgot to mention that you have to specify the bug tracking: which project you are using for bug tracking and so on. I think that makes a good QA test plan. If you have any other questions, please feel free to connect with me on LinkedIn. I am open to discussion if you are having issues or there is something you want to understand about the QA process, and generic mentoring is also possible; I can be open to that. I got another question about certification. Certification is good to have, but not everybody evaluates a certificate the same way. You definitely want to understand the product and the current technological trends, front end and back end, and after that you can decide whether you want to do a certification; it is not required by any company.
But it is helpful; at least it gives you some credentials, so when a hiring manager is looking at your resume, it stands out, and they definitely get interested when you have certain specific skill sets. I do not see any other questions, so thank you so much for joining this session. I am very happy to see you all here, and definitely keep in touch. You can connect with me via LinkedIn, Twitter, or Instagram, whichever way is comfortable for you. Thank you so much, and I am looking forward to seeing you in another session like this. Thank you, Anna, and the WomenTech Network for giving me this opportunity to share my knowledge with everyone out there. Thanks. Bye. Take care.