Virtual Private Networks (VPNs) have become a popular tool for anyone aiming to improve anonymity and freedom whilst browsing the web. With a vast array of VPN providers on the market, choosing the right one can be difficult.
Our team of technology experts take great care to fully test and review VPNs. First and foremost, we want to ensure you get the best-value VPN for your needs. More than that, recommending an online security product that isn’t up to scratch could have serious consequences for our readers, such as leaked or stolen information.
Here’s a guide to our comprehensive testing methodology, explaining how we dissect and evaluate VPN services. We’ll look in-depth at the key factors we consider, helping you make an informed decision that suits your needs.
Each overall VPN score is derived from a weighted average of eight separate scoring factors. Our best VPN lists use objective, data-driven tests for scoring, assessing the factors and sub-factors that we believe are most important for anyone considering a VPN. These tests include 500+ speed tests across MacOS, Windows, iOS, and Android devices. In addition, we consider a number of key aspects when comparing VPNs. The breakdown is as follows:
Our VPN tests compare performance across eight critical factors: privacy & security, speed, pricing, content access, usability, features, customer service, and the company.
Each of these factors is weighted based on its importance. For instance, privacy and security (25%) carries more weight than content access (12.5%). These factor scores are then combined to calculate the total score for each service. Both our individual factor scores and our overall VPN scores are rounded to two decimal places.
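To make the maths concrete, here’s a rough sketch of how a weighted average of this kind is calculated. Only the 25% and 12.5% weights are quoted above; the remaining weights, and all of the per-factor scores, are purely illustrative placeholders.

```python
# Illustrative only: combine per-factor scores (0-10) into an overall score.
# Only the 25% and 12.5% weights are stated in the methodology; the rest are
# placeholders chosen simply so that the weights sum to 100%.
weights = {
    "privacy_and_security": 0.25,   # stated above
    "content_access": 0.125,        # stated above
    "speed": 0.15,                  # placeholder
    "pricing": 0.125,               # placeholder
    "usability": 0.10,              # placeholder
    "features": 0.10,               # placeholder
    "customer_service": 0.075,      # placeholder
    "the_company": 0.075,           # placeholder
}

scores = {                          # made-up example scores for one VPN
    "privacy_and_security": 9.10,
    "content_access": 8.50,
    "speed": 8.75,
    "pricing": 7.90,
    "usability": 9.00,
    "features": 8.20,
    "customer_service": 7.50,
    "the_company": 8.00,
}

overall = sum(weights[f] * scores[f] for f in weights)
print(round(overall, 2))  # 8.52, rounded to two decimal places
```

The same principle applies within each factor, where weighted sub-factors are combined into the factor score before it feeds the overall figure.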
We also track and consider a number of non-scoring factors which help to inform our reviews and listings. For example, we test each VPN’s reliability by streaming 4K content on a Fire Stick device. We track the total number of subscriptions offered as part of the pricing criteria, as well as the number of servers in relation to features. These are important to test and discuss within our reviews and roundups, but they don’t contribute to the overall score. The number of servers, for instance, doesn’t necessarily equate to a better service or faster speeds, while providers like Mullvad offer a simple pricing plan with a sole monthly subscription. More options are generally better, but it’s not always cut-and-dried.
We use a simple 1-10 grading scale, with two decimal places to further differentiate between VPNs. Here’s a rundown of the scores and what they mean:
10.00 | Perfect score |
9.00 – 9.99 | Excellent score |
8.00 – 8.99 | Great score |
7.00 – 7.99 | Decent score |
6.00 – 6.99 | Mediocre score |
5.00 – 5.99 | Poor score |
4.00 – 4.99 | Awful score |
Anything scoring below 4.00 should also be seen as terrible.
Each VPN is assessed using the same criteria, and given a relevant score for each section. These are weighted and used for the overall scores.
We’re constantly striving to improve our testing process, which helps to ensure accurate results and listings, as well as helpful VPN recommendations. This includes regular updates to our methodology and scoring.
Our current scoring methodology is version 1.0.
Version 1.0 represents a shift to a data-driven, quantitative approach, using objective data wherever possible. To begin, we whittled down 35 providers to a shortlist of 10 VPNs for further testing.
Some scoring criteria, like content access, are relatively straightforward, as it’s a case of checking whether or not the VPN works with a specific service. For the other overall scores, we’ve weighted various sub-factors based on their importance. These are discussed below.
Our lead cybersecurity expert, Rafay Baloch, has worked to put together a robust methodology covering everything to do with privacy and security. Planning and conducting the following tests on each provider took Rafay 50 hours alone.
The criteria are as follows:
All of the sections above are divided into further subsections. For example, Transparency (20% overall) looks for any independent pentest reports (6%), along with a provider’s no-logs policy (7%). Transparency also considers the frequency of these audits (2%), the frequency of pentest reports (2%), whether the VPN keeps any timestamps (2%), and whether privacy impact assessment reports (1%) are available.
Every section has been split up similarly, covering leak tests and much more.
We’ve conducted thousands of speed tests across a long list of devices over the past 12 months. To account for performance across numerous operating systems, we’ve tested speeds for each provider listed on four key platforms.
The overall Speed score is a weighted average of the results recorded while testing each VPN’s performance when connected to servers in the UK, the US, and Australia. To get a more accurate representation, each server is tested three times, at six-hour intervals.
Speed tests are conducted via MacOS, Windows, Android, and iOS devices. These are the most popular operating systems in the UK, and should account for the majority of desktop and mobile users.
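As a minimal sketch of how a Speed figure could be assembled from those runs: the download speeds below are invented, and equal per-location weighting is assumed, since the exact weights aren’t published here.

```python
from statistics import mean

# Hypothetical download-speed results in Mbps: three runs per server
# location, taken at six-hour intervals (all values are made up).
runs = {
    "UK": [540, 512, 533],
    "US": [498, 505, 476],
    "Australia": [402, 389, 410],
}

# A weighted average across locations is described above; the exact weights
# aren't given, so equal weighting is assumed for this illustration.
location_weights = {"UK": 1 / 3, "US": 1 / 3, "Australia": 1 / 3}

avg_per_location = {loc: mean(speeds) for loc, speeds in runs.items()}
speed_result = sum(location_weights[loc] * avg_per_location[loc] for loc in runs)
print(f"{speed_result:.1f} Mbps")  # ~473.9 Mbps, which then feeds the 1-10 Speed score
```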
Reliability might be a non-scoring factor, but it helps to provide background information about the capabilities of each service. Our reliability tests check whether the VPN is capable of streaming 4K content on a Fire Stick device. Each test runs for an hour, and we note any lag or latency issues.
Rather than solely considering the price, we’ve aimed to identify which VPN offers the best value for money. The current price is a good place to start; we look at the average cost of yearly coverage. This is likely to be the most important aspect for any potential user, so it’s weighted at 50% of the total Pricing score. A solid refund policy (20%) is the hallmark of any trustworthy VPN, and we also factor in the length of free trials (10%), the existence of free versions (10%), and the money saved via the best promotions (10%).
We selected 10 of the most popular streaming services in the UK for the content access section. To come up with the list, we looked at popularity within the Apple App Store, as well as the Google Play Store. To date, these are as follows:
Testing for each streaming service was conducted using a desktop browser, rather than a specific app.
Usability has been divided into three sections.
Design and UX relates to how long it takes to switch to a new server, as well as how many clicks it takes to connect to the VPN once the app has been opened.
For the jargon section, we looked at each VPN’s Flesch score, which measures the overall readability of its homepage. Providers should be able to explain more technical aspects simply.
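For reference, the standard Flesch Reading Ease formula can be sketched as follows; the word, sentence, and syllable counts used here are invented, and this is an illustration of the calculation rather than our exact tooling.

```python
def flesch_reading_ease(total_words: int, total_sentences: int, total_syllables: int) -> float:
    """Standard Flesch Reading Ease formula: higher scores mean easier text."""
    words_per_sentence = total_words / total_sentences
    syllables_per_word = total_syllables / total_words
    return 206.835 - 1.015 * words_per_sentence - 84.6 * syllables_per_word

# Made-up counts for a VPN homepage: 600 words, 40 sentences, 900 syllables.
print(round(flesch_reading_ease(600, 40, 900), 1))  # 64.7, roughly "plain English"
```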
Compatibility relates to the total number of features found across Windows, MacOS, iOS, and Android devices. A feature can be anything from split tunnelling to obfuscation.
Features are important for any VPN. For the kill switch, malware protection, the ad blocker, Multihop, and split tunnelling, we checked whether each feature was available on all core devices (MacOS, Windows, iOS, and Android). Full marks are earned for offering these features across every device.
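As a rough illustration of how that all-devices check can be tallied: the availability data below is entirely made up, and how partial coverage feeds into the final Features score isn’t detailed here.

```python
# Hypothetical availability matrix: does each feature exist on each of the
# four core platforms? Full marks for a feature require all four.
PLATFORMS = ["MacOS", "Windows", "iOS", "Android"]

availability = {
    "kill switch":        {"MacOS": True,  "Windows": True, "iOS": True,  "Android": True},
    "malware protection": {"MacOS": True,  "Windows": True, "iOS": False, "Android": True},
    "ad blocker":         {"MacOS": True,  "Windows": True, "iOS": True,  "Android": True},
    "Multihop":           {"MacOS": False, "Windows": True, "iOS": False, "Android": True},
    "split tunnelling":   {"MacOS": True,  "Windows": True, "iOS": False, "Android": True},
}

for feature, per_platform in availability.items():
    covered = sum(per_platform[p] for p in PLATFORMS)
    note = " (full marks)" if covered == len(PLATFORMS) else ""
    print(f"{feature}: {covered}/{len(PLATFORMS)} platforms{note}")
```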
Countries with coverage are also important, especially if you plan to access services found in a different region. We’ve also looked at the number of servers, although it’s a non-scoring factor, given it doesn’t give an indication of overall performance. 10% of the overall Features score relates to the number of distinct locations on offer; within the UK, for example, this includes cities like London, Glasgow, and Manchester. We also checked out parental controls, although this is another non-scoring factor.
Given some VPN companies have proven that they can’t be trusted, it makes sense to take an in-depth look at the organisation that will be looking after your personal data. The sub-factors are as follows:
To build up a trust profile, we consider whether the VPN has committed any infractions with regulators such as the ASA and the ICO, while full marks are earned if the provider is based outside of the EU and isn’t in a Five Eyes country. It’s worth listening to user complaints, so we tallied feedback found on services like TrustPilot. Given the vast majority of these reviews couldn’t be verified, we decided it was fair to keep this as a non-scoring factor.
Customer service is especially important for new users, and will likely be your first port of call if you’re aiming to get a refund, or would like to find out more about the service. We’ve looked at the following data points for this section:
Average response times give a better idea of what to expect when contacting customer support. These are tested using live chat support, checking how long it takes to speak to a real person. If live chat isn’t available, we log response times via email.
Ideally, support staff will be able to provide a satisfactory resolution. We asked them questions on three separate occasions, relating to available features, how to get a refund, and which servers are best used for streaming.
Given there are over 1,000 data points and lots of different calculations to consider, there is a possibility that we have made an error or two along the way. The methodology and the data itself have been checked multiple times to ensure accuracy, but we’d be happy to explain our scoring, and we’ll correct any errors accordingly.
Do you have any suggestions or comments about our methodology and scoring system? Get in touch with us if you have any feedback.