httpx Proxy Configuration

httpx is a modern, fully-featured HTTP client for Python 3 with both synchronous and asynchronous APIs, plus support for HTTP/1.1 and HTTP/2. If you know Requests, you'll find httpx familiar but with additional capabilities for modern applications.

httpx has built-in proxy support through its proxy parameter, making it easy to route traffic through ProxyMesh rotating proxies. This guide shows you how to configure httpx for web scraping, async data collection, and API integrations.

Why Choose httpx?

  • Async and sync APIs - Use the same patterns for both
  • HTTP/2 support - Better performance with multiplexed connections
  • Type hints - Full typing for better IDE integration
  • Familiar API - Similar to Requests
  • Native proxy support - Easy configuration via proxy parameter

Installation

Install httpx using pip:

pip install httpx

For HTTP/2 support, install the optional extra (the quotes prevent shell globbing in zsh):

pip install 'httpx[http2]'

httpx has built-in proxy support—no additional packages needed to get started.

Basic Proxy Configuration

httpx supports proxies through the proxy parameter on the client (this replaces the older proxies parameter, which was deprecated in httpx 0.26 and removed in 0.28):

import httpx

# ProxyMesh proxy with authentication
proxy_url = "http://username:password@PROXYHOST:PORT"

with httpx.Client(proxy=proxy_url) as client:
    response = client.get("https://api.ipify.org?format=json")
    print(response.json())  # {"ip": "..."}

Each request through ProxyMesh may use a different IP address from the rotation pool.

Async Proxy Requests

httpx's async client mirrors the sync API, making it easy to run requests concurrently:

import httpx
import asyncio

async def fetch_with_proxy():
    proxy_url = "http://username:password@PROXYHOST:PORT"
    
    async with httpx.AsyncClient(proxy=proxy_url) as client:
        response = await client.get("https://api.ipify.org?format=json")
        print(response.json())

asyncio.run(fetch_with_proxy())

The async client is ideal for making many concurrent requests through rotating proxies.

Environment Variables

httpx respects standard proxy environment variables:

export HTTPS_PROXY="http://user:pass@PROXYHOST:PORT"

Then in Python:

import httpx

# Automatically uses proxy from environment
with httpx.Client() as client:
    response = client.get("https://api.ipify.org?format=json")

IP Authentication

If your IP address is whitelisted in your ProxyMesh account, no credentials are needed:

import httpx

with httpx.Client(proxy="http://PROXYHOST:PORT") as client:
    response = client.get("https://api.ipify.org?format=json")
    print(response.json())

Custom Proxy Headers

httpx can send custom headers to the proxy using httpx.Proxy:

import httpx

proxy = httpx.Proxy(
    "http://username:password@PROXYHOST:PORT",
    headers={"X-ProxyMesh-Country": "US"}
)

with httpx.Client(proxy=proxy) as client:
    response = client.get("https://api.ipify.org?format=json")
    print(response.json())

To also receive proxy response headers (like X-ProxyMesh-IP), use the python-proxy-headers extension:

pip install python-proxy-headers

Then mount the header-aware transport:

import httpx
from python_proxy_headers.httpx_proxy import HTTPProxyTransport

proxy = httpx.Proxy(
    "http://user:pass@PROXYHOST:PORT",
    headers={"X-ProxyMesh-Country": "US"}
)
transport = HTTPProxyTransport(proxy=proxy)

with httpx.Client(mounts={"https://": transport}) as client:
    response = client.get("https://api.ipify.org?format=json")
    print(f"Routed through: {response.headers.get('X-ProxyMesh-IP')}")

See the python-proxy-headers documentation for async support and more options.

Common Use Cases

Concurrent Async Scraping

import httpx
import asyncio

async def fetch_url(client, url):
    response = await client.get(url)
    return {"url": url, "status": response.status_code}

async def scrape_many(urls):
    proxy_url = "http://user:pass@PROXYHOST:PORT"
    
    async with httpx.AsyncClient(proxy=proxy_url) as client:
        tasks = [fetch_url(client, url) for url in urls]
        results = await asyncio.gather(*tasks)
        return results

urls = [f"https://httpbin.org/anything/{i}" for i in range(20)]
results = asyncio.run(scrape_many(urls))

for r in results:
    print(f"{r['url']}: {r['status']}")

Multiple Proxy Locations

import httpx

PROXY_LOCATIONS = {
    "us": "PROXYHOST:PORT",
    "uk": "PROXYHOST:PORT",
    "de": "PROXYHOST:PORT",
}

def fetch_from_location(url, location):
    proxy = f"http://user:pass@{PROXY_LOCATIONS[location]}"
    with httpx.Client(proxy=proxy) as client:
        return client.get(url)

us_response = fetch_from_location("https://api.ipify.org", "us")
uk_response = fetch_from_location("https://api.ipify.org", "uk")

print(f"US IP: {us_response.text}")
print(f"UK IP: {uk_response.text}")

POST with JSON

import httpx

with httpx.Client(proxy="http://user:pass@PROXYHOST:PORT") as client:
    response = client.post(
        "https://httpbin.org/post",
        json={"key": "value"}
    )
    print(response.json())

Timeout Configuration

import httpx

# httpx.Timeout requires a default value when setting individual timeouts
timeout = httpx.Timeout(10.0, connect=10.0, read=30.0)

with httpx.Client(
    proxy="http://user:pass@PROXYHOST:PORT",
    timeout=timeout
) as client:
    response = client.get("https://example.com")

HTTP/2 with Proxy

import httpx

with httpx.Client(
    proxy="http://user:pass@PROXYHOST:PORT",
    http2=True
) as client:
    response = client.get("https://http2-enabled-site.com")
    print(f"HTTP version: {response.http_version}")

Streaming Downloads

import httpx

with httpx.Client(proxy="http://user:pass@PROXYHOST:PORT") as client:
    with client.stream("GET", "https://example.com/large-file.zip") as response:
        with open("download.zip", "wb") as f:
            for chunk in response.iter_bytes():
                f.write(chunk)

ProxyMesh Headers Reference

Send these headers to control proxy behavior:

  • X-ProxyMesh-Country - Route through a specific country (e.g., "US"). Only works with world proxy or open proxy
  • X-ProxyMesh-IP - Request a specific outgoing IP address
  • X-ProxyMesh-Not-IP - Exclude specific IPs from rotation

The proxy returns X-ProxyMesh-IP in the response with the IP address used.

Related Python Proxy Guides

Explore proxy configuration for other Python HTTP libraries:

  • Requests - The most popular Python HTTP library
  • aiohttp - High-performance pure async HTTP client
  • urllib3 - Low-level HTTP with connection pooling
