

Using Proxies With Python Requests

Published: 2023-09-29

Written By Owen Crisp

If you're using Python to scrape the web and have run into blocks, proxies can help you circumvent them. Proxies keep you anonymous by hiding your original location and IP, and they let you rotate your IP address to avoid blocks and geolocation issues!


To begin, you will need Python 3 installed on your device. You will also need proxies, purchasable from our dashboard here.

For this project, you'll need to reformat the proxies from the way they're generated on our dashboard. If you didn't know already, all our proxies are currently generated in this format:

ip:port:username:password

However, when using proxies in Python, a slight reformat is required:

http://username:password@ip:port

You can do this manually, or with a short Python script if you have a list of proxies:

proxies = []

with open("proxies.txt", "r") as f:
    proxylist = f.read().splitlines()
    for proxy in proxylist:
        ip, port, user, pw = proxy.split(":")
        proxies.append(f"http://{user}:{pw}@{ip}:{port}")
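To make the reformat concrete, here's what a single dashboard-style line looks like after conversion (the values below are dummy placeholders, not real proxy credentials):

```python
# Dummy proxy line in the dashboard format ip:port:user:pass
line = "203.0.113.5:8080:myuser:mypass"

# Split into its four parts and rebuild in the format requests expects
ip, port, user, pw = line.split(":")
formatted = f"http://{user}:{pw}@{ip}:{port}"
print(formatted)  # http://myuser:mypass@203.0.113.5:8080
```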

We can now begin writing our main script to make requests.

Building The Script

Begin by importing the HTTP library, along with random, which we'll use to pick a proxy:  

import random
import requests

After this, you'll need to generate and reformat the proxies as mentioned earlier. Then define a dictionary with the proxy URLs associated with both the HTTP and HTTPS protocols:  

proxies = {
    'http': random.choice(proxies),
    'https': random.choice(proxies),
}

Once defined, input a target URL. This is where the script will aim its request:  

url = 'https://ipinfo.io/'

After this, we'll define the response. This utilises a GET request to target the website defined in url, with the proxies we predefined in the dictionary earlier.  

response = requests.get(url, proxies=proxies)

Finally, you'll need to print the response. You can verify the script works simply by doing:  

print(response)

Printing the response displays the status code stored in response, which in this case should be 200. Status code 200 is the acknowledgement of a successful request: it has been received, understood, and returned. We can use this as a very quick method to test whether a proxy is working.
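Building on that idea, you could filter a whole proxy list by testing each entry and keeping only those that return 200. This is a minimal sketch (the check_proxies helper and the five-second timeout are our own choices, not part of the script above):

```python
import requests

def check_proxies(proxy_urls, test_url="https://ipinfo.io/json", timeout=5):
    """Return only the proxies that complete a GET request successfully."""
    working = []
    for proxy_url in proxy_urls:
        proxies = {"http": proxy_url, "https": proxy_url}
        try:
            r = requests.get(test_url, proxies=proxies, timeout=timeout)
            if r.status_code == 200:
                working.append(proxy_url)
        except requests.RequestException:
            # Dead proxies raise timeouts or connection errors; skip them
            pass
    return working
```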

However, you can print slightly differently by parsing the response as JSON:  

print(response.json())

This will print the response payload as JSON.

This code is an example of how to use proxies with the requests library to access web resources through a proxy server, which is useful for many networking and web scraping tasks, especially when accessing restricted sites or when anonymity is needed.
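If you're making several requests, you can lean on the rotation idea from the introduction and pick a fresh proxy for each one. A minimal sketch, assuming a pre-formatted proxies list like the one built above (make_proxy_dict is a hypothetical helper name):

```python
import random

def make_proxy_dict(proxy_urls):
    """Pick one proxy at random and map it to both protocols."""
    proxy_url = random.choice(proxy_urls)
    return {"http": proxy_url, "https": proxy_url}

# Each request can then be routed through a different random IP:
# r = requests.get(url, proxies=make_proxy_dict(proxies), timeout=5)
```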

Full Code Example


import random
import requests

proxies = []

with open("proxies.txt", "r") as f:
    proxylist = f.read().splitlines()
    for proxy in proxylist:
        ip, port, user, pw = proxy.split(":")
        proxies.append(f"http://{user}:{pw}@{ip}:{port}")

try:
    # Pick a random proxy for each protocol
    proxies = {
        "http": random.choice(proxies),
        "https": random.choice(proxies),
    }

    url = "https://ipinfo.io/json"
    r = requests.get(url, proxies=proxies)
    print(f"Status Code: {r.status_code}, Proxy Location: {r.json()['country']} IP: {r.json()['ip']}")

except Exception as e:
    print(f"Request Failed - {e}")



This step-by-step tutorial covered the most important lessons about using proxies with requests in Python. Proxies help you bypass anti-bot systems, and Rampage offers the choice of every major upstream provider at the most competitive rates, all available on a sleek dashboard and underlying developer API.

Why Rampage is the best proxy platform

Unlimited Connections and IPs

Limitations are a thing of the past. Supercharge your data operations with the freedom to scale as you need.

Worldwide Support

From scraping multiple web targets simultaneously to managing multiple social media and eCommerce accounts – we’ve got you covered worldwide.

Speedy Customer Support

We offer 24/7 customer support via email and live chat. Our team is always on hand to help you with any issues you may have.

Digital Dashboard

Manage all of your proxy plans on one dashboard - no more logging into multiple dashboards to manage your proxies.