How to scrape real estate listings from Rightmove?

Web scraping  ·  8 min read  ·  Published: 14/01/2020

Looking for a Rightmove scraper in Python? Rightmove is the UK's largest property platform, with over 800,000 active listings at any given time. This guide shows you how to build a Python scraper that extracts structured property data at scale, covering both rental and purchase listings, using ScrapingBot's API.

1. Why scrape Rightmove?

The UK property market is one of the most active in Europe. Rightmove alone receives over 100 million visits per month. Scraping it gives you access to:

  • Price trends by city, postcode, or neighborhood
  • Rental vs. purchase market comparisons across the UK
  • Agency and agent contact details
  • Publishing dates to spot negotiation opportunities on older listings
  • UK-specific data like weekly vs. monthly rental prices and bedroom counts

We've already covered scraping Funda in the Netherlands and Zillow in the US — Rightmove follows the same pattern but with UK-specific formats (GBP, bedrooms instead of surface area).

2. Technical challenges

Rightmove uses several layers of protection that make straightforward scraping difficult:

  • Bot detection — Rightmove blocks headless browsers and suspicious request patterns.
  • JavaScript rendering — Listing data is loaded dynamically and not present in the raw HTML.
  • IP rate limiting — Repeated requests from the same IP trigger blocks quickly.
  • Price format variations — Rental prices can be per week or per month, requiring normalization.

3. How ScrapingBot handles them

ScrapingBot's Real Estate API handles all of this automatically: it rotates UK-based IPs, renders JavaScript fully, and returns clean structured JSON — whether you're scraping rental or purchase listings on Rightmove.

4. Step-by-step: build your Rightmove scraper in Python

Install the library

pip install requests

Basic setup

import requests

# Your ScrapingBot credentials
USERNAME = "your_username"
API_KEY  = "your_api_key"

def scrape_rightmove(url):
    api_url = "https://api.scraping-bot.io/scrape/real-estate"
    payload = {"url": url}

    response = requests.post(
        api_url,
        json=payload,
        auth=(USERNAME, API_KEY)
    )

    if response.status_code == 200:
        return response.json()
    else:
        raise Exception(f"Error {response.status_code}: {response.text}")
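Transient failures (HTTP 429s, timeouts, brief blocks) are common when scraping at volume, so it's worth wrapping the call above in a retry loop. Here's a minimal sketch; the `with_retries` helper and its backoff values are our own, not part of ScrapingBot's API:

```python
import time

def with_retries(fetch, url, max_retries=3, backoff=2.0):
    """Call fetch(url), retrying on failure with a growing delay."""
    for attempt in range(max_retries):
        try:
            return fetch(url)
        except Exception:
            if attempt == max_retries - 1:
                raise  # give up after the last attempt
            time.sleep(backoff * (attempt + 1))  # wait 2s, then 4s, ...

# Usage: data = with_retries(scrape_rightmove, listing_url)
```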

5. Scraping rental listings

Let's start with rental listings in London. Rightmove rental pages include the monthly or weekly rent, number of bedrooms, agency contact details, and publishing date.

import requests, time

# Scrape rental listings in London
RENTAL_URL = "https://www.rightmove.co.uk/property-to-rent/find.html?locationIdentifier=REGION%5E87490"

def scrape_rentals(n_pages=3):
    results = []
    for page in range(n_pages):
        url = f"{RENTAL_URL}&index={page * 24}"
        data = scrape_rightmove(url)
        results.extend(data.get("listings", []))
        time.sleep(1)  # polite delay
    return results

rentals = scrape_rentals()
print(f"Collected {len(rentals)} rental listings")
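Pagination can return the same listing on two consecutive result pages, so it's worth deduplicating before analysis. A small sketch; the `id` and `url` field names are assumptions about the returned JSON, so adjust the key to whatever unique field your responses actually contain:

```python
def dedupe_listings(listings, key="id"):
    """Drop listings already seen on an earlier results page."""
    seen = set()
    unique = []
    for listing in listings:
        marker = listing.get(key) or listing.get("url")
        if marker in seen:
            continue
        seen.add(marker)
        unique.append(listing)
    return unique

# Usage: rentals = dedupe_listings(rentals)
```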

Here's what a typical rental listing returns:

Field          Example value                            Type
------------   --------------------------------------   -------
title          Studio flat to rent in Hackney, London   string
monthly_rent   £1,450 pcm                               string
weekly_rent    £335 pw                                  string
bedrooms       0 (studio)                               integer
agency         Foxtons Hackney                          string
agency_phone   020 7000 0000                            string
listing_date   2024-01-08                               string
description    A well-presented studio flat...          string
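The `listing_date` field makes it easy to spot the older listings mentioned earlier as negotiation opportunities. A small helper, assuming the dates come back in ISO format as shown above:

```python
from datetime import date

def days_on_market(listing_date, today=None):
    """Days since publication, given a listing_date like '2024-01-08'."""
    today = today or date.today()
    published = date.fromisoformat(listing_date)
    return (today - published).days

# Listings on the market for 60+ days may be open to negotiation:
# stale = [l for l in rentals if days_on_market(l["listing_date"]) >= 60]
```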

6. Scraping purchase listings

For purchase listings, the structure is similar but includes the sale price and number of bedrooms — a key metric in UK property ads, where surface area in sqm is rarely specified.

# Scrape purchase listings in London
BUY_URL = "https://www.rightmove.co.uk/property-for-sale/find.html?locationIdentifier=REGION%5E87490"

def scrape_purchases(n_pages=3):
    results = []
    for page in range(n_pages):
        url = f"{BUY_URL}&index={page * 24}"
        data = scrape_rightmove(url)
        results.extend(data.get("listings", []))
        time.sleep(1)
    return results

purchases = scrape_purchases()
print(f"Collected {len(purchases)} purchase listings")

Here's what a typical purchase listing returns:

Field            Example value                         Type
--------------   -----------------------------------   -------
title            3 bedroom house for sale in Hackney   string
price            £650,000                              string
bedrooms         3                                     integer
bathrooms        2                                     integer
agency           Savills London                        string
agency_phone     020 7000 1111                         string
listing_date     2023-11-20                            string
days_on_market   62                                    integer
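Since bedrooms stand in for surface area in UK listings, price per bedroom is a quick comparable-value metric. A rough sketch using the field names from the table above:

```python
def price_per_bedroom(listing):
    """Rough value metric: sale price in GBP divided by bedroom count."""
    price = float(listing["price"].replace("£", "").replace(",", ""))
    bedrooms = listing.get("bedrooms") or 1  # treat studios (0 beds) as 1
    return round(price / bedrooms)

# price_per_bedroom({"price": "£650,000", "bedrooms": 3})  -> 216667
```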

7. UK-specific data formats

One particularity of Rightmove compared to Funda or Zillow: UK listings rarely mention the total surface area in sqm. Instead, the number of bedrooms is the primary size indicator. Rental prices also come in two formats — per calendar month (pcm) and per week (pw) — so it's worth normalizing them before analysis:

import pandas as pd

def normalize_rent(listings):
    for listing in listings:
        # Prefer the weekly figure when present, otherwise fall back to monthly
        rent = listing.get("weekly_rent") or listing.get("monthly_rent") or ""
        if "pw" in rent:
            # Convert weekly rent to monthly: 52 weeks spread over 12 months
            weekly = float(rent.replace("£", "").replace(" pw", "").replace(",", ""))
            listing["monthly_rent_gbp"] = round(weekly * 52 / 12, 2)
        elif "pcm" in rent:
            monthly = float(rent.replace("£", "").replace(" pcm", "").replace(",", ""))
            listing["monthly_rent_gbp"] = monthly
    return listings

df = pd.DataFrame(normalize_rent(rentals))
df.to_csv("rightmove_rentals.csv", index=False)
print(df.head())

8. Going further

Once your Rightmove scraper is running, you can combine the results with Zillow and Funda data to build cross-market dashboards comparing UK, US, and Dutch property prices. Use pandas for normalization and schedule the scraper with a cron job to track price changes over time. ScrapingBot supports all three platforms through the same API interface.
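Tracking price changes between two daily snapshots comes down to a pandas merge. A minimal sketch on illustrative inline data; in practice you would `read_csv` the dated files your cron job writes, and the `id` column here is an assumption about your listing schema:

```python
import pandas as pd

# Two snapshots of the same listings on consecutive days (illustrative data)
old = pd.DataFrame({"id": [1, 2], "monthly_rent_gbp": [1450.0, 2000.0]})
new = pd.DataFrame({"id": [1, 2], "monthly_rent_gbp": [1400.0, 2000.0]})

# Join on listing id, then keep only rows where the rent moved
merged = old.merge(new, on="id", suffixes=("_old", "_new"))
changed = merged[merged["monthly_rent_gbp_old"] != merged["monthly_rent_gbp_new"]]
print(changed)  # listing 1 dropped from £1,450 to £1,400
```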

Ready to try it? Get 1,000 free API calls when you sign up for ScrapingBot.

Try ScrapingBot for free →
