QA Blog Posts
Google's Lighthouse is a useful tool for testing the performance of any website. It reports load times and suggests ways to improve the site.
It's built into every Chrome browser: open Developer Tools (Command-Option-I on a Mac), then click on Lighthouse in the top menu bar.
Official Product Description
Lighthouse is an open-source, automated tool for improving the quality of web pages. You can run it against any web page, public or requiring authentication. It has audits for performance, accessibility, progressive web apps, SEO, and more.
You can run Lighthouse in Chrome DevTools, from the command line, or as a Node module. You give Lighthouse a URL to audit, it runs a series of audits against the page, and then it generates a report on how well the page did. From there, use the failing audits as indicators on how to improve the page. Each audit has a reference doc explaining why the audit is important, as well as how to fix it.
Useful for QA?
This is a useful tool for catching broken images that may go unnoticed during testing, especially since the audit runs from a server outside of the VPN.
In addition, the "Properly size images" audit can highlight images that cause slow load times - particularly important for mobile users.
It's also useful for QA to test login boundaries. Since some portions of the site may require a login, you can use this tool to see how much of the site is exposed without logging in.
It's easy to run, and the near-instant results make it a quick tool to assist QA with general page testing.
The Lighthouse team is constantly adding new audit metrics based on industry best practices.
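Lighthouse can also emit its report as JSON, which QA can post-process to pull out just the failing audits for a bug report. The sketch below is a minimal illustration, assuming the standard report shape (an `audits` object keyed by audit id, each entry carrying a `title` and a `score` between 0 and 1, or `null` for informational audits); the sample report object here is made up for the example, not real Lighthouse output:

```javascript
// Illustrative sample of a Lighthouse-style JSON report (not real output).
const sampleReport = {
  audits: {
    'image-size-responsive': { title: 'Serves images with appropriate resolution', score: 0 },
    'uses-responsive-images': { title: 'Properly size images', score: 0.5 },
    'viewport': { title: 'Has a <meta name="viewport"> tag', score: 1 },
    'diagnostics': { title: 'Diagnostics', score: null }, // informational: no score
  },
};

// Collect audits that scored below a passing threshold, skipping
// informational (null-score) entries.
function failingAudits(report, threshold = 0.9) {
  return Object.entries(report.audits)
    .filter(([, audit]) => audit.score !== null && audit.score < threshold)
    .map(([id, audit]) => `${id}: ${audit.title} (score ${audit.score})`);
}

console.log(failingAudits(sampleReport));
```

A list like this can be pasted straight into a daily QA report, so the team only sees the audits that actually need attention.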
Ask QA: Should You Always Report Bugs?
Someone recently asked me:
Should you keep raising bugs that you know won't get fixed? Isn't it a waste of people's time during ticket triage to look at very minor issues?
Answer: First of all, QA should not be looking for reasons NOT to create bug reports. QA should file a ticket for every bug they encounter, then assign the right priority to each issue. It's up to Product/Dev to decide how to triage the minor ones.
Tickets should be created for every issue. Here are some reasons why:
Sometimes they get fixed: In some instances, I have seen developers fix multiple issues in a common code base just because they have the file open and can make the modifications.
It's a test of a good QA team - Engineers and Product will see that QA is active, looking for issues, and reporting them. If customers are reporting issues that QA isn't finding, that calls into question how good the QA team is.
Bug Patterns - I have found that finding little bugs can lead to finding bigger bugs. When developers are sloppy in some areas, it's a sign that they may be sloppy in others. So when I find a minor issue, it makes me stop and think: is there another issue here that may not be so obvious?
Maybe the Feature isn't Ready, Yet - If QA finds a lot of minor issues, it might be a sign that the product team should sit back and take notice. Should a new feature/product really go to market with so many minor issues? If they aren't addressed now, when will they be?
Don't be Discouraged
Minor tickets can be frustrating to find and report - even if QA knows that no one will read the full bug. However, there may come a time when the bug will get fixed, so it's important to report the issue.
- Don't spend too much time writing up minor tickets. Just do the bare minimum on reporting. If the Dev team needs clarification, you can always go back and add detail to the report.
- Generate a report of the minor bugs. If you do daily or weekly QA reports, make sure to highlight the minor bugs found. There's strength in numbers.
- Combine Issues - When you have some downtime, revisit some of the old minor issues and see if it makes sense to combine several into a single ticket - which may make the new ticket a higher priority.
Test Case Specs
On January 25, 1988, President Ronald Reagan said the following in his last State of the Union address:
"As ancient Chinese philosopher, Lao-tzu, said, "Govern a great nation as you would cook a small fish; do not overdo it.""
The idea behind this quote can be applied to regression test cases. I often see that, when initially creating a test case, the QA engineer will overwhelm it with complex steps and validations. It's important to keep the test case as simple as possible.
Tips and Tricks
Each test should focus on one test result. (This will make it easier to downgrade tests.)
The test case should piggyback on a parent test - for example, the common path to the functionality being tested. This makes it easier to absorb future enhancements and to run the same test against other paths that customers may take.
Keep the test as simple as possible. If there's complexity involved, consider splitting the test up.
Test cases should be easy enough that anyone new to the organization could follow the steps. However, don't make them so simple that the reason behind the test is lost.
Just like developers have code review, QA should have their test cases reviewed by their peers.
Google Chrome URL Page
Google Chrome has a lot of useful built-in debugging tools. You can access many of them via a single cheat-sheet page, chrome://chrome-urls/:
How Does this Help QA?
There are some useful items here for QA. Here are a few that I've encountered.
Chrome Flags - This is where you can see all the experiments available in the current browser. It's worth checking every once in a while: you can find out what new tools may be released soon and play with them.
Accessibility - This allows you to take a deep dive into Chrome's accessibility data based on what's currently loaded. You can find out more about the accessibility testing available in Chrome on the Accessibility page.
User Action - This is useful when you want to see how Chrome responds to certain keyboard/mouse actions.
Inspect - You can set up port forwarding here. It's useful when you need to debug an issue on a particular server: you don't have to change your system settings, simply enable it at the browser level.
Net-Export - If you need a data dump of web activity, this is the place to get it. Simply enable the logging and perform the activity; a full dump of all the activity will be written to a JSON file. This is useful when a developer can't reproduce an error on their end - they may find something in the logs that explains why.
Dino - Everyone's favorite Dino game is available whenever you want.
Fun to Explore
Take some time to check out all 85 links on the Chrome URLs page.
Again, to access the page type this URL in the Browser URL field: chrome://chrome-urls/
Regression Test Planning is like Packing for a Trip
This is a follow-up to last week's post.
Vacation packing experts will tell you to take less than you need, such as this tip from The Blonde Abroad:
Here’s a traveler’s secret. After all that packing, chances are you won’t use half of what you plan to take. So why not save yourself the trouble and leave half behind? A neat trick that I’ve always done is set out all the clothes that I want to bring and definitely think I’ll need and then take just half of them.
How does this relate to QA Regression Test Planning?
When planning the test strategy for adding a new feature to the regression suite, you should revise the coverage to focus on the most important functionality. When a feature is first created, every part is critical, but over time some parts become less so. Yet when it's time to add the feature to regular regression - as part of the QA Transition Plan - minor features are often still treated as critical.
Three Tips to Writing a Test Plan
It's QA's job to balance the priority of testing the functionality against the amount of time available for regression testing. Here are some best-practice tips I have learned over the years:
Know What is Important - The QA Team Lead needs to work with the Product and Developer Leads on the critical parts of the feature. The group needs to decide what functionality is critical to the product going forward, post-launch. This is where the packing analogy comes in: decide now what is critical to take on the next release journey.
Tag Jira Tickets - When writing the test case, it's a good idea to reference the original Jira Ticket. I can't tell you how many hours have been spent searching in Jira for the QA ticket, just to find the product description or specs. I have found that test tickets may have error log notes that may help debug issues that happen long after the product was released.
After all is said and done, more is said than done - I have seen projects that were once mission-critical become a very low priority in a very short time. Despite all the "sense of urgency" initially given, other projects become more important. Ideally, a good QA Lead will follow up with the Product team to verify the priority of the feature functionality. However, balancing other projects usually distracts QA from the ideal process.
Testing is like
Using some of the phrases from WordStream:
Testing is like herding cats. Cats are fiendishly complicated to manage, they eat when they want, sleep when they want, come and go as they please. A cat is going to do what it wants when it wants and there's precious little you can do to change that. QA Testing isn't entirely like that, but it's close.
Weekly Release Testing is like World of Warcraft because getting to green, and staying there, is testing hand-to-hand combat that never ends. The closer you get to the release day, the more cunning and more ferocious the feature testing you must face, and defeat. Release Day is a battlefield littered with the corpses of features that weren't up to the challenge.
Regression testing is like bacon: Everyone loves it. Some people try to say they hate it. That just means they love it even more.
Testing is like a can of Campbell's vegetable beef soup. If you can, think of the juice as the product features; you need the juice to make soup, and you need the features for testing. The vegetables are like testing best practices. The meat is like gray-hat testing. Last, you have the sodium, which is about 85% of your daily value and not really that good for you. But it's also what keeps the soup fresh, and that freshness is why I get to keep eating my cans of soup. Similarly, there is a ton of code in the codebase today, but about 25% of the code created is not used on a daily basis. That code is still fresh, though, and to the end-users, that freshness is just as important as any other code.
Automation is Not
Here we go...
Here are eight things that people may think QA automation is - but they are so wrong:
Automation is not....
- A substitute for manual testing. It should complement the sum of all the testing done in a release cycle.
- Easy. It takes time to build a strong test foundation that can be properly maintained.
- Hard. There are a lot of "quick" hit automation tools. Make sure to have a plan. Don't create tests that just "chicken peck" functionality.
- A One and Done. Automation requires lots of regular maintenance work for feature changes and to build in complexity.
- Testing. It's only for verification of a specific predefined path. Just keep in mind that customers may not take the same approach - this is why manual testing is good.
- One Size Fits All. There are lots of automation solutions available. In the past two companies I worked at, both decided to build their own - just so they could have ownership.
- Just about Selenium. There are other alternatives available; don't think you're limited to Selenium. Check out Katalon Studio, Robot Framework, and CasperJS, to name a few.
- Abracadabra. It's not magical, and it's worth QA's time to understand the technology behind it so they can get the most out of it.
What do you think of my list? Did I miss anything? Or am I way off base?
Some companies have an internal, company-wide product show-and-tell - otherwise known as a Product Palooza.
What is a Product Palooza?
A Product Palooza is a chance for Product and Engineering to showcase all the upcoming features being worked on. This is usually an internal showcase for the Sales and CS teams, and it's a good way to communicate upcoming changes.
QA should have a similar type of event - call it a Bug Bash - showcasing some of the unique bugs that QA has found.
The Bug Bash's target audience would be the Product and Engineering teams.
Some Example Exhibits:
- Most Complex Bug Found
- Tools that QA Uses to Find Bugs
- Tips and Tricks on Testing your own code
- Testing Technology and Techniques
- Testing Challenges and Ways They Are Addressed
A Bug Bash can help QA educate the engineering team about various testing challenges. It can also help the Product team understand test planning and time allocation.
Managing Testing Around Furloughs and Layoffs
Many companies are cutting expenses and implementing short-term furloughs and layoffs. This means that QA has to balance testing with smaller teams.
Here are some ideas to make this process less disruptive to the team:
- Documentation - make sure that everyone updates their smoke-testing and regression test steps.
- Schedule - if team members are going on furlough in different weeks, make sure that everyone knows who's going to be out and when.
- Regression Test Repository - make sure that the test case repository is updated with the latest tests. In particular, make sure the test steps are up to date.
- Ticket Testing - make sure testing steps are clear if tickets are being released while the tester is out. If there was any new discovery while testing a ticket, it's critical to make sure that it's well documented.
- Wiki Page Update - make sure that project-based QA reference pages are updated with the current contacts and release plan. (More information on the QA Reference Page in a different post.)
- Product Review Meeting - after layoffs occur, it's a good idea to meet with the Product team and go over regression planning. With a smaller team, it's time to rethink testing priorities. Product can help lay the groundwork on which areas should get the most focus. In some businesses, it might be good to include Customer Support, as they may have ideas on the areas of the product that get the most attention.
QA Tag Lines
Every once in a while I encounter a situation where we need a good QA tag line. It might be for a marketing event, or just something to spice up the mood around the office.
My coworkers will come up with some creative content - some good, and some needing a little more creativity. At the end of the day, I appreciate the thoughtfulness and work they put into each submission.
Creative Tag Lines
Here are some suggested tag lines for any QA team. Some of these are based on classic commercial tag lines.
- It Matters
- Good Enough is not Good Enough
- Quality Up
- Question Everything
- Quality is Job 1
- What could possibly go wrong?
- A little science...a little magic...a little quality.
- Say "No" to bugs.
- Make it Right - Mike Holmes
- It's time to get serious with QA
Got a Tagline?
If you know of any other tag lines, please share. We are always looking for something new.
Enable Reader Mode in Chrome
There's a little-known feature in the desktop version of Chrome: the ability to see only the text on webpages. This has been available in Safari for years, and Google has finally decided to bring it to Chrome.
The feature is "neat" because you can basically eliminate ads from articles, which can sometimes be a distraction. In some cases, you can even bypass a paywall and see an article that is blocked the conventional way.
Enable the Feature
To enable the feature you need to open Chrome and go to this URL:
Use the pull-down menu to the right of "Enable Reader Mode" and set it to Enabled.
Then click on the blue "Relaunch" button and Chrome Browser will restart.
Toggle Reader Mode
Now when you visit some websites, you'll see a new icon to the right of the URL. Click on it, and you'll have the same reader mode you get in Safari.
Fixed some Classic Posts
I spent some time today reviewing some old blog posts. Some of these were old and needed some refreshing updates.
I fixed up:
Here are my notes on these posts:
- Fixed a lot of the spelling and miscellaneous sentences that just didn't make sense. Maybe they made sense when I was putting them together, but definitely not now.
- Updated the images so that the post looks good on Desktop and Mobile
- Didn't add any additional content. I would like to update the pictures, as I wasn't thinking of posting them when they were taken.
- Corrected data from comments that people left in the post
- Updated some basic phrases
The best way to write up ticket testing - for QA - is two simple steps:
- What do you want QA to test?
- What is the expected outcome?
Example Test Situation
As a new user, go to Google.com and click on the Sign In Button
You should see that English (United States) is the default language
Simple Steps To Get
By using these steps, you give clear guidance to the QA team on how to properly test the code change. The "Expected Results" section helps clarify the specific change that was made.
Note: QA is still free to do exploratory testing around the change, time permitting.
In the Past…
I have encountered a lot of tickets with vague testing descriptions, which led to a lot of unnecessary back-and-forth between QA and Dev. Basically, it wasn't clear exactly what was changed and how the functionality should work.
Find Empty Name Ids
One of the challenges with writing any QA automation is XPaths. They are essential to finding elements on a page. You need them to verify functionality or to take some action, such as clicking a button.
The XPath helps automation find that location, and it's best built using name IDs. Name IDs are important because they are supposed to be unique on a page.
More often than not, developers will leave out name IDs. This means that QA has to use the full XPath location, which makes the automation code risky: if someone makes a simple change to the page, such as adding a new element, the automation test may fail because the XPath path is broken.
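The difference between the two kinds of locators can be sketched with a small helper. This is a hypothetical illustration (not part of any automation framework): elements are modeled as plain objects with `tag`, `id`, and `parent` fields so the idea is easy to see, whereas real code would walk actual DOM nodes. The helper prefers an ID-anchored XPath and only falls back to a brittle positional path when no ancestor has an ID:

```javascript
// Sketch: build an XPath for an element, preferring an ID anchor.
// Elements here are plain objects ({ tag, id, parent }) for illustration only.
function xpathFor(el) {
  const segments = [];
  for (let node = el; node; node = node.parent) {
    if (node.id) {
      // An ID anchors the path: everything above it can be ignored,
      // so changes higher in the page won't break the locator.
      segments.unshift(`//*[@id="${node.id}"]`);
      return segments.join('/');
    }
    segments.unshift(node.tag);
  }
  // No ID anywhere: fall back to a fragile full positional path.
  return '/' + segments.join('/');
}

// Illustrative element chain: html > body > div#main > button
const button = {
  tag: 'button',
  parent: { tag: 'div', id: 'main',
    parent: { tag: 'body', parent: { tag: 'html', parent: null } } },
};

console.log(xpathFor(button)); // → //*[@id="main"]/button
```

With the ID anchor, adding elements above `div#main` leaves the locator intact; the full `/html/body/...` path would break. This is exactly why it pays to ask developers for IDs at key points.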
Use JQuery to Help with Ids
If you're testing a site that uses jQuery, you can run this simple script to find all the areas of the page that don't have a name ID:
$('*:not([id]):not([class])').css("border","2px solid red");
To use this, simply open the Chrome console panel and paste in the above code. Any element without an ID or class will get highlighted in red.
Not every single DIV tag needs a class or ID, but this is a good technique to find out where they are missing on the page.
Here's the code in a bookmarklet format, so that you can run it whenever you want:
If you know about Bookmarklets, then you know that you can simply Drag/Drop this to your Bookmark toolbar: Check Name IDs
Knowledge is Powerful
Knowing which DIVs lack a name ID can help with the automation process - QA can now request IDs at all the key points in the web code.
Simply take a screenshot and ask Devs to put name IDs where needed.
Webpage Spell-Check Extension
Catching misspelled words can be tricky, and it's not a fun job for QA. Sometimes it's easy to spot a misspelled word when it seems obvious - but more often than not, you won't catch them all.
Four Things That Make This a Practical Tool
- You can instantly check the spelling on any page. It doesn't matter if the page is in a secure location or on your hard drive.
- You can enable it so that it's always on when you're testing certain domains.
- You can edit the content on the page using the Control key - that way you can test to make sure the correctly spelled word doesn't break the design of the site.
- The extension uses the Google Chrome spell checker, so as you add words to the dictionary in other applications, those words won't get flagged by Spell-Check as incorrect.
This is a pretty cool tool, and once you have it set up - you can ignore it until you suddenly see some words with the red lines under them.
This lets you check off "Website Spell Check" as a value-add service from your QA team.