Dev Log
Shopify App Dev
November 6, 2025

Cloud Proxy API Service Deployment Guide

Shawn Shen
Founder of Selofy

Self-hosted cloud proxy services provide the robust, reliable, and secure foundation required for production-level API integration. Proper security (API key validation), explicit timeout configuration, and comprehensive client-side error handling are crucial for success.

Background Problem

When developing Shopify apps that integrate with external services like Google APIs, developers frequently face network and stability challenges:

  • Local Development Limitations: Direct API calls from localhost often fail due to network security restrictions.
  • Tunnel Instability: Development tunnels (like Ngrok) can be unreliable for handling consistent, high-volume API requests.
  • Geographic Restrictions: Server location heavily impacts latency and compliance for specific APIs.
  • Rate Limiting: Shared development IP addresses can quickly hit API rate limits.

The definitive solution is to deploy a dedicated, self-hosted cloud proxy service that routes API requests through a stable, persistent server.
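
In practice, the app backend posts the details of the target request (URL, method, headers, body) to a single authenticated proxy endpoint, and the proxy performs the outbound call. A rough sketch of that call shape, using the /proxy endpoint and X-API-Key header defined in the steps below (the target URL is a placeholder):

// Sketch of the proxy call shape (endpoint, header, and payload fields are defined in the steps below)
fetch('https://proxy.yourdomain.com/proxy', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'X-API-Key': process.env.CLOUD_PROXY_SECRET // shared secret configured on the proxy server
  },
  body: JSON.stringify({
    url: 'https://api.example.com/v1/resource', // placeholder for the real target API URL
    method: 'GET',
    headers: { Authorization: 'Bearer <access token>' }
  })
})
  .then((res) => res.json())
  .then((payload) => console.log(payload.status, payload.data));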

Vercel Deployment Challenges

Initial attempts to deploy this service on serverless platforms like Vercel failed due to fundamental limitations:

1. Serverless Function Constraints

  • Strict Timeout Limit: Serverless functions are typically limited to a short execution time (e.g., 10 seconds), insufficient for long-running or batch API requests.
  • Cold Start Delays: Latency incurred by cold starts severely impacts the responsiveness of the proxy.
  • Memory Limitations: Constraints on memory make processing large API responses difficult.

2. Network and Security Issues

  • Non-Persistent Connections: Lack of persistent connections between function invocations.
  • Inconsistent IP Addresses: The dynamic nature of serverless execution makes IP whitelisting for API access impossible.

3. Cost and Control Problems

  • High Per-Request Cost: Cost models are unsuitable for frequent, high-volume proxy calls.
  • Limited Control: Lack of full control over the execution environment, logging, and performance tuning.

These issues necessitate a self-hosted, containerized approach for a reliable production proxy.

Self-Hosted Server Deployment

Prerequisites

  • VPS or dedicated server with a public IP (e.g., AWS EC2, DigitalOcean Droplet)
  • Docker and Docker Compose installed
  • Domain name configured with DNS access
  • Basic Linux command line knowledge

Step 1: Create Proxy Service (server.js)

This Node.js server provides a secure /proxy endpoint, enforces an API key check, and intelligently handles target API responses (JSON vs. raw text) with a 30-second timeout.

// server.js - Secure Node.js Proxy Server
const express = require('express');
const cors = require('cors');

const app = express();
const PORT = process.env.PORT || 8765;

app.use(cors());
app.use(express.json({ limit: '5mb' })); // Allow up to 5MB request body

// Health check endpoint
app.get('/health', (req, res) => {
    res.json({ status: 'ok', timestamp: new Date().toISOString() });
});

// Main proxy endpoint
app.post('/proxy', async (req, res) => {
    // CRITICAL SECURITY CHECK: Validate API Key from the client header
    const expectedSecret = process.env.PROXY_SECRET;
    const clientKey = req.headers['x-api-key'];

    if (expectedSecret && clientKey !== expectedSecret) {
        // Return 401 if the key is invalid or missing
        return res.status(401).json({ error: 'Unauthorized', message: 'Missing or invalid X-API-Key header.' });
    }
    
    try {
        const { url, method = 'GET', headers = {}, body } = req.body;
        
        // Timeout configuration (30 seconds)
        const controller = new AbortController();
        const timeoutId = setTimeout(() => controller.abort(), 30000);

        const response = await fetch(url, {
            method,
            headers: {
                // Ensure a standard User-Agent is sent to the target API
                'User-Agent': 'Mozilla/5.0 (compatible; CloudProxy/1.0)',
                ...headers
            },
            // Serialize body for methods that typically carry a payload
            body: method !== 'GET' && method !== 'HEAD' ? JSON.stringify(body) : undefined,
            signal: controller.signal // Apply the timeout signal
        });
        
        clearTimeout(timeoutId);

        // Intelligent parsing based on Content-Type header.
        // Read the body once as text, then parse: the body stream can only be consumed a single time,
        // so calling response.text() after a failed response.json() would throw.
        const contentType = response.headers.get('content-type');
        const rawBody = await response.text();
        let data = rawBody;

        if (contentType && contentType.includes('application/json')) {
            try {
                data = JSON.parse(rawBody);
            } catch (e) {
                // Fallback: keep the raw text if JSON parsing fails
            }
        }

        // Return the structured response containing the target API details
        res.status(200).json({ 
            status: response.status, // The actual HTTP status from the target API
            statusText: response.statusText,
            data: data,
            headers: Object.fromEntries(response.headers.entries()),
            isError: !response.ok // Flag if the target API returned an error status (4xx, 5xx)
        });

    } catch (error) {
        console.error('Proxy internal execution error:', error.message);
        // Handle timeout (AbortError) and other internal proxy errors
        res.status(500).json({
            error: 'Proxy execution failed',
            message: error.message
        });
    }
});

app.listen(PORT, () => {
    console.log(`Cloud proxy server running on port ${PORT}`);
});
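
Before containerizing, the server can be sanity-checked locally. The sketch below is a hypothetical helper (not one of the deployment files); it assumes the server was started with PROXY_SECRET=test-secret node server.js and uses https://example.com as a stand-in target URL.

// smoke-test.js - hypothetical local check of the /proxy endpoint (assumes PROXY_SECRET=test-secret and port 8765)
async function smokeTest() {
  const res = await fetch('http://localhost:8765/proxy', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'X-API-Key': 'test-secret' // must match the PROXY_SECRET the server was started with
    },
    body: JSON.stringify({ url: 'https://example.com', method: 'GET' })
  });

  const payload = await res.json();
  console.log('Proxy HTTP status:', res.status);
  console.log('Target status:', payload.status, '| isError:', payload.isError);
}

smokeTest().catch(console.error);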

Step 2: Docker Configuration

package.json (Dependencies)

// package.json - Dependency file
{
  "name": "cloud-proxy",
  "version": "1.0.0",
  "main": "server.js",
  "dependencies": {
    "express": "^4.18.2",
    "cors": "^2.8.5"
  },
  "scripts": {
    "start": "node server.js"
  }
}

Dockerfile (Image Build)

# Dockerfile - Used to build the Node.js service container image
FROM node:18-alpine

WORKDIR /app

COPY package*.json ./
# Install only production dependencies for efficiency
RUN npm ci --only=production

COPY . .

EXPOSE 8765

CMD ["node", "server.js"]

docker-compose.yml (Deployment Configuration)

IMPORTANT: You must replace YOUR_VERY_STRONG_API_KEY_HERE with a complex, unique secret key.
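
One way to generate such a key is with Node's built-in crypto module (a sketch; any tool that produces a long random string works equally well):

// generate-secret.js - print a 64-character random hex string to use as PROXY_SECRET
const crypto = require('crypto');
console.log(crypto.randomBytes(32).toString('hex'));

Run it with node generate-secret.js and paste the output in place of the placeholder below.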

# docker-compose.yml - Docker Compose deployment file
version: '3.8'

services:
  cloud-proxy:
    build: .
    ports:
      - "8765:8765" # Map host port to container port
    environment:
      - NODE_ENV=production
      - PORT=8765
      # CRITICAL: Define the secret key used for client authentication
      - PROXY_SECRET=YOUR_VERY_STRONG_API_KEY_HERE
    restart: unless-stopped
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:8765/health"]
      interval: 30s
      timeout: 10s
      retries: 3

Step 3: Deploy the Service

Run these commands in the directory containing the four files created above (server.js, package.json, Dockerfile, docker-compose.yml):

# Command Line - Service Deployment
# Create project directory (if starting fresh)
mkdir cloud-proxy && cd cloud-proxy

# Place server.js, package.json, Dockerfile, docker-compose.yml here

# Build and start the service (using -d for detached mode)
docker-compose up -d

# Check container status
docker-compose ps
# View real-time logs
docker-compose logs -f
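
With the container running, a quick request to the /health endpoint confirms the service is answering (this assumes the default 8765 port mapping above):

// health-check.js - verify the deployed container responds on the mapped host port
fetch('http://localhost:8765/health')
  .then((res) => res.json())
  .then((body) => console.log('Health check:', body))
  .catch((err) => console.error('Health check failed:', err.message));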

Step 4: Security Hardening (CRITICAL)

The proxy service must be protected to prevent unauthorized use and abuse.

Action: Implement API Key Protection

  1. Generate Key: Create a robust, unique API key.
  2. Configure Key: Set this key as the PROXY_SECRET in your docker-compose.yml.
  3. Server Verification: The server.js file enforces that the client must provide this key in the X-API-Key header (an optional constant-time comparison refinement is sketched below).
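
As an optional refinement to the verification in step 3, the key comparison can be made constant-time so the comparison itself leaks no timing information. A sketch using Node's crypto.timingSafeEqual (this helper is not part of the server.js above):

// safe-equal.js - optional constant-time key comparison (hypothetical helper, not in server.js above)
const crypto = require('crypto');

function safeEqual(a = '', b = '') {
  const bufA = Buffer.from(String(a));
  const bufB = Buffer.from(String(b));
  // timingSafeEqual throws if the buffers differ in length, so check lengths first
  if (bufA.length !== bufB.length) return false;
  return crypto.timingSafeEqual(bufA, bufB);
}

// In the /proxy handler, `clientKey !== expectedSecret` would become `!safeEqual(clientKey, expectedSecret)`.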

Step 5: Configure Domain and SSL (Nginx)

Set up a reverse proxy using Nginx to handle SSL termination and route traffic to the Docker container.

# Nginx Configuration Example - /etc/nginx/sites-available/proxy.yourdomain.com
server {
    listen 80;
    server_name proxy.yourdomain.com;
    
    location / {
        proxy_pass http://localhost:8765;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
        
        # Increase timeouts for API requests
        proxy_connect_timeout 60s;
        proxy_send_timeout 60s;
        proxy_read_timeout 60s;
    }
}

Enable SSL using Let's Encrypt and Certbot:

# Command Line - SSL Configuration
# Install certbot
sudo apt install certbot python3-certbot-nginx

# Get SSL certificate
sudo certbot --nginx -d proxy.yourdomain.com

# Enable site configuration and reload Nginx
sudo ln -s /etc/nginx/sites-available/proxy.yourdomain.com /etc/nginx/sites-enabled/
sudo nginx -t && sudo systemctl reload nginx

Step 6: Client Integration (cloud-proxy.client.js)

The client utility must send the required secret key and handle the structured JSON response returned by the proxy server.

// cloud-proxy.client.js - Client Utility Function for Cloud Proxy
const CLOUD_PROXY_ENDPOINT = 'https://proxy.yourdomain.com/proxy';
// IMPORTANT: This key must be securely stored in your application's environment
const PROXY_SECRET = 'YOUR_VERY_STRONG_API_KEY_HERE'; 

async function fetchViaCloudProxy(url, options = {}) {
  const controller = new AbortController();
  // Client-side timeout (shorter than server-side 30s timeout)
  const timeoutId = setTimeout(() => controller.abort(), 25000); 
  
  try {
    const proxyResponse = await fetch(CLOUD_PROXY_ENDPOINT, {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        'X-API-Key': PROXY_SECRET, // 🔑 CRITICAL: Send the secret key for authentication
        'User-Agent': 'Mozilla/5.0 (compatible; AppClient/1.0)',
      },
      body: JSON.stringify({
        url,
        method: options.method || 'GET',
        headers: options.headers || {},
        body: options.body
      }),
      signal: controller.signal
    });
    
    clearTimeout(timeoutId);

    // 1. Handle Proxy Service Errors (e.g., 401 Unauthorized, 500 Internal Proxy Error)
    if (!proxyResponse.ok) {
        const errorBody = await proxyResponse.json();
        throw new Error(`Proxy Service Error (${proxyResponse.status}): ${errorBody.message || 'Unknown proxy error'}`);
    }

    // 2. Extract and Process Target API Response from Proxy payload
    const proxyPayload = await proxyResponse.json();
    
    // If the target API returned an error status (e.g., 400, 404, 503)
    if (proxyPayload.isError) {
        console.error(`Target API Error (${proxyPayload.status}):`, proxyPayload.data);
        const error = new Error(`API call failed with status ${proxyPayload.status}`);
        error.status = proxyPayload.status;
        error.data = proxyPayload.data;
        error.headers = proxyPayload.headers;
        throw error;
    }

    return proxyPayload.data; // Return the successfully fetched data

  } catch (error) {
    clearTimeout(timeoutId);
    throw error;
  }
}

// Smart fetch with fallback to direct request if proxy fails
async function smartFetch(url, options = {}) {
  try {
    // Try cloud proxy first
    return await fetchViaCloudProxy(url, options);
  } catch (error) {
    console.warn('Cloud proxy failed, attempting direct request fallback:', error.message);

    // Fallback to direct request (may fail due to network restrictions).
    // Parse the body here so both paths return data rather than a raw Response object.
    const directResponse = await fetch(url, options);
    if (!directResponse.ok) {
      throw new Error(`Direct request failed with status ${directResponse.status}`);
    }
    const contentType = directResponse.headers.get('content-type');
    return contentType && contentType.includes('application/json')
      ? directResponse.json()
      : directResponse.text();
  }
}
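
A usage sketch (the target URL and token are placeholders for whatever external API your app calls):

// Example usage of the client utility (hypothetical target URL and token)
async function fetchExternalResource(accessToken) {
  const data = await smartFetch('https://api.example.com/v1/resource', {
    method: 'GET',
    headers: { Authorization: `Bearer ${accessToken}` }
  });
  console.log('Fetched data via proxy:', data);
  return data;
}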

Step 7: Environment Configuration

Update your application's environment variables to use the new proxy:

# .env
USE_CLOUD_PROXY=true
CLOUD_PROXY_ENDPOINT=https://proxy.yourdomain.com/proxy
CLOUD_PROXY_SECRET=YOUR_VERY_STRONG_API_KEY_HERE
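
The client utility can then read these values from the environment instead of hard-coding them (a sketch; the variable names match the .env entries above):

// cloud-proxy.client.js - read proxy settings from the environment
const CLOUD_PROXY_ENDPOINT = process.env.CLOUD_PROXY_ENDPOINT || 'https://proxy.yourdomain.com/proxy';
const PROXY_SECRET = process.env.CLOUD_PROXY_SECRET;
const USE_CLOUD_PROXY = process.env.USE_CLOUD_PROXY === 'true';
// smartFetch can consult USE_CLOUD_PROXY to decide whether to attempt the proxy route at all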

Key Benefits

  1. Reliability: Dedicated server with a static IP address for whitelisting.
  2. Performance: Optimized for long API requests with 30-second timeouts.
  3. Control: Full control over proxy configuration, logging, and environment updates.
  4. Cost-Effective: Predictable, fixed monthly server cost vs. serverless per-request pricing.
  5. Security: Protected by a mandatory, secret API key (X-API-Key header).
  6. Geographic Optimization: Deploy in regions closest to target APIs for optimal latency.

Frequently Asked Questions

Q1: Why can't I simply use Vercel or other Serverless platforms to host this proxy?
Q2: Why is Nginx required as a reverse proxy?
Q3: What are the main benefits of a self-hosted proxy compared to a local development tunnel (e.g., Ngrok)?
Q4: What is the most critical security measure for this proxy service?
Q5: Why is the server-side timeout set to 30 seconds in server.js?