aiohttp Proxy Configuration

aiohttp is a high-performance async HTTP client/server framework for Python built on asyncio. When you need to make thousands of concurrent HTTP requests, aiohttp is one of the fastest options available in Python.

aiohttp has built-in proxy support, making it easy to route traffic through ProxyMesh rotating proxies. Combined with async concurrency, you can make thousands of simultaneous requests through different IP addresses.

Why aiohttp for Web Scraping?

  • True concurrency - Make hundreds of requests simultaneously
  • Connection pooling - Efficiently reuse connections
  • Memory efficient - Handle thousands of connections with minimal memory
  • Native proxy support - Built-in proxy parameter
  • Production ready - Used by major companies for high-volume data collection
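Connection pooling is tunable through aiohttp.TCPConnector. A minimal sketch of a bounded pool (the limits shown are illustrative, not recommendations, and the proxy URL in the comment is a placeholder):

```python
import asyncio
import aiohttp

async def show_pool_settings():
    # A bounded pool: at most 20 sockets total, 5 per host, reused
    # across requests instead of opening a new connection each time.
    connector = aiohttp.TCPConnector(limit=20, limit_per_host=5)
    async with aiohttp.ClientSession(connector=connector) as session:
        print(session.connector.limit)           # total connection cap
        print(session.connector.limit_per_host)  # per-host cap
        # session.get(url, proxy="http://user:pass@PROXYHOST:PORT")
        # would draw connections from this pool.

asyncio.run(show_pool_settings())
```

Closing the session also closes the connector it owns, so no separate cleanup is needed.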

Installation

Install aiohttp using pip:

pip install aiohttp

aiohttp has built-in proxy support—no additional packages needed to get started.

Basic Proxy Configuration

aiohttp supports proxies through the proxy parameter in request methods:

import aiohttp
import asyncio

async def fetch_with_proxy():
    proxy_url = "http://username:password@PROXYHOST:PORT"
    
    async with aiohttp.ClientSession() as session:
        async with session.get(
            "https://api.ipify.org?format=json",
            proxy=proxy_url
        ) as response:
            data = await response.json()
            print(data)  # {"ip": "..."}

asyncio.run(fetch_with_proxy())

Each request through ProxyMesh may use a different IP address from the rotation pool.
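One way to confirm the rotation is to sample the reported address over several requests and count the distinct values. A sketch with placeholder credentials; uncomment the final line to run it against a real proxy:

```python
import asyncio
import aiohttp

def distinct_ips(ips):
    # Rotation is working if more than one distinct address appears.
    return sorted(set(ips))

async def sample_ips(n=5, proxy="http://user:pass@PROXYHOST:PORT"):
    async with aiohttp.ClientSession() as session:
        ips = []
        for _ in range(n):
            async with session.get(
                "https://api.ipify.org?format=json", proxy=proxy
            ) as response:
                ips.append((await response.json())["ip"])
    return distinct_ips(ips)

# print(asyncio.run(sample_ips()))
```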

Concurrent Requests

Make many requests simultaneously through rotating proxies:

import aiohttp
import asyncio

async def fetch_url(session, url, proxy):
    async with session.get(url, proxy=proxy) as response:
        return {"url": url, "status": response.status}

async def scrape_many(urls):
    proxy = "http://user:pass@PROXYHOST:PORT"
    
    async with aiohttp.ClientSession() as session:
        tasks = [fetch_url(session, url, proxy) for url in urls]
        return await asyncio.gather(*tasks)

urls = [f"https://httpbin.org/anything/{i}" for i in range(50)]
results = asyncio.run(scrape_many(urls))

print(f"Fetched {len(results)} URLs")

Proxy Authentication Methods

import aiohttp
import asyncio

async def auth_examples():
    async with aiohttp.ClientSession() as session:
        # Method 1: Credentials in URL
        async with session.get(
            "https://api.ipify.org",
            proxy="http://user:pass@PROXYHOST:PORT"
        ) as response:
            print(await response.text())
        
        # Method 2: BasicAuth object
        auth = aiohttp.BasicAuth("username", "password")
        async with session.get(
            "https://api.ipify.org",
            proxy="http://PROXYHOST:PORT",
            proxy_auth=auth
        ) as response:
            print(await response.text())

asyncio.run(auth_examples())

Environment Variables

aiohttp reads proxy settings from environment variables when the session is created with trust_env=True:

export HTTPS_PROXY="http://user:pass@PROXYHOST:PORT"

Then in Python:
import aiohttp
import asyncio

async def use_env_proxy():
    async with aiohttp.ClientSession(trust_env=True) as session:
        async with session.get("https://api.ipify.org") as response:
            print(await response.text())

asyncio.run(use_env_proxy())

Custom Proxy Headers

aiohttp can send custom headers to the proxy using proxy_headers:

import aiohttp
import asyncio

async def with_proxy_headers():
    async with aiohttp.ClientSession() as session:
        async with session.get(
            "https://api.ipify.org?format=json",
            proxy="http://user:pass@PROXYHOST:PORT",
            proxy_headers={"X-ProxyMesh-Country": "US"}
        ) as response:
            print(await response.json())

asyncio.run(with_proxy_headers())

To also receive proxy response headers (like X-ProxyMesh-IP), use the python-proxy-headers extension:

pip install python-proxy-headers

Then use ProxyClientSession in place of ClientSession:
import asyncio
from python_proxy_headers import aiohttp_proxy

async def with_response_headers():
    async with aiohttp_proxy.ProxyClientSession() as session:
        async with session.get(
            "https://api.ipify.org?format=json",
            proxy="http://user:pass@PROXYHOST:PORT",
            proxy_headers={"X-ProxyMesh-Country": "US"}
        ) as response:
            print(f"IP: {response.headers.get('X-ProxyMesh-IP')}")

asyncio.run(with_response_headers())

See the python-proxy-headers documentation for more details.

Common Use Cases

High-Volume Concurrent Scraping

import aiohttp
import asyncio

async def fetch_with_semaphore(session, url, proxy, semaphore):
    async with semaphore:
        async with session.get(url, proxy=proxy, timeout=aiohttp.ClientTimeout(total=30)) as response:
            return {"url": url, "status": response.status}

async def scrape_many(urls, max_concurrent=50):
    proxy = "http://user:pass@PROXYHOST:PORT"
    semaphore = asyncio.Semaphore(max_concurrent)
    
    async with aiohttp.ClientSession() as session:
        tasks = [fetch_with_semaphore(session, url, proxy, semaphore) for url in urls]
        return await asyncio.gather(*tasks, return_exceptions=True)

urls = [f"https://httpbin.org/anything/{i}" for i in range(200)]
results = asyncio.run(scrape_many(urls))

success = sum(1 for r in results if isinstance(r, dict))
print(f"Successful: {success}/{len(urls)}")

Multiple Proxy Locations

import aiohttp
import asyncio

PROXIES = {
    "us": "http://user:pass@PROXYHOST:PORT",
    "uk": "http://user:pass@PROXYHOST:PORT",
    "de": "http://user:pass@PROXYHOST:PORT",
}

async def fetch_from_locations():
    async with aiohttp.ClientSession() as session:
        for location, proxy in PROXIES.items():
            async with session.get("https://api.ipify.org", proxy=proxy) as r:
                print(f"{location}: {await r.text()}")

asyncio.run(fetch_from_locations())

POST Requests

import aiohttp
import asyncio

async def post_with_proxy():
    proxy = "http://user:pass@PROXYHOST:PORT"
    
    async with aiohttp.ClientSession() as session:
        async with session.post(
            "https://httpbin.org/post",
            proxy=proxy,
            json={"key": "value"}
        ) as response:
            print(await response.json())

asyncio.run(post_with_proxy())

Error Handling and Retries

import aiohttp
import asyncio

async def fetch_with_retry(session, url, proxy, max_retries=3):
    for attempt in range(max_retries):
        try:
            async with session.get(
                url,
                proxy=proxy,
                timeout=aiohttp.ClientTimeout(total=30)
            ) as response:
                if response.status == 200:
                    return await response.json()
                elif response.status == 429:
                    await asyncio.sleep(2 ** attempt)
                    continue
        except (aiohttp.ClientError, asyncio.TimeoutError):
            if attempt == max_retries - 1:
                raise
            await asyncio.sleep(1)
    return None

async def main():
    proxy = "http://user:pass@PROXYHOST:PORT"
    async with aiohttp.ClientSession() as session:
        result = await fetch_with_retry(session, "https://api.ipify.org?format=json", proxy)
        print(result)

asyncio.run(main())

Streaming Large Downloads

import aiohttp
import asyncio

async def download_file():
    proxy = "http://user:pass@PROXYHOST:PORT"
    
    async with aiohttp.ClientSession() as session:
        async with session.get("https://example.com/file.zip", proxy=proxy) as response:
            with open("download.zip", "wb") as f:
                async for chunk in response.content.iter_chunked(8192):
                    f.write(chunk)

asyncio.run(download_file())

ProxyMesh Headers Reference

Send these headers to control proxy behavior:

  • X-ProxyMesh-Country - Route through a specific country (e.g., "US"). Only works with world proxy or open proxy
  • X-ProxyMesh-IP - Request a specific outgoing IP address
  • X-ProxyMesh-Not-IP - Exclude specific IPs from rotation

The proxy returns X-ProxyMesh-IP in the response with the IP address used.
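These headers are plain strings in the proxy_headers dict, so a small helper can assemble them. A sketch; the comma-separated format for multiple excluded IPs is an assumption to verify against the ProxyMesh docs, and the proxy URL in the comment is a placeholder:

```python
import asyncio
import aiohttp

def rotation_headers(country=None, ip=None, not_ips=()):
    # Assemble X-ProxyMesh-* request headers. Multiple excluded IPs
    # are joined with commas (assumed format; verify against the docs).
    headers = {}
    if country:
        headers["X-ProxyMesh-Country"] = country
    if ip:
        headers["X-ProxyMesh-IP"] = ip
    if not_ips:
        headers["X-ProxyMesh-Not-IP"] = ", ".join(not_ips)
    return headers

async def fetch(url, proxy):
    async with aiohttp.ClientSession() as session:
        async with session.get(
            url, proxy=proxy, proxy_headers=rotation_headers(country="US")
        ) as response:
            return await response.text()

# asyncio.run(fetch("https://api.ipify.org", "http://user:pass@PROXYHOST:PORT"))
```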

Resources

Related Python Proxy Guides

Explore proxy configuration for other Python HTTP libraries:

  • httpx - Modern HTTP client with both sync and async APIs
  • Requests - Simple synchronous HTTP library
  • Scrapy - Full-featured web crawling framework
