Page is under construction.
Feedback Welcome!
Gateway Modes
You can run a Gateway in one of two modes:
- “Off-chain”: development or local mode
- “On-chain”: production mode connected to the blockchain-based Livepeer network.
There is currently no Livepeer “Testnet” with Orchestrator offerings, though there are conversations underway to enable this in the future.

Do you think Livepeer should have a “testnet” available for Gateways to connect to? Follow and contribute to the discussion in the Discord and on the Forum.
Deploy a Gateway for AI Inference Services
You can run the Livepeer AI software using one of the following methods:
- Docker (Recommended): The simplest and preferred method.
- Pre-built Binaries: An alternative if you prefer not to use Docker.
Deploy an AI Gateway
Follow the steps below to start your Livepeer AI Gateway node. These instructions apply to both on-chain & off-chain Gateway deployments.
The steps below use Docker (recommended); pre-built binaries can be used as an alternative.
1. Retrieve the Livepeer AI Docker Image

Fetch the latest Livepeer AI Docker image with the following command:
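A minimal sketch of the pull step; the image name and tag below are an assumption, so check the Livepeer docs for the currently recommended image:

```bash
# Pull the go-livepeer image, which ships the AI Gateway functionality.
# The image name/tag is an assumption; verify the current tag in the docs.
docker pull livepeer/go-livepeer:master
```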
2. Launch a Localhost (Off-chain) AI Gateway

Run the Docker container for your AI Gateway node to launch a local (off-chain) AI Gateway. The flags are similar to those used for a Mainnet Transcoding Network Gateway; see the go-livepeer CLI reference for details.
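A minimal sketch of such a launch command, assuming the image above and go-livepeer CLI flag names; verify the flags and image tag against the CLI reference before use. `<ORCH_IP>:<ORCH_PORT>` is a placeholder for the Orchestrator you want to send jobs to.

```bash
# Sketch of an off-chain (local) AI Gateway launch. Flag names follow the
# go-livepeer CLI, but confirm them against the CLI reference:
#   -gateway            run the node in Gateway mode
#   -network offchain   local/dev mode, no on-chain registry
#   -orchAddr           Orchestrator to send AI jobs to (placeholder below)
#   -httpAddr           HTTP API/ingest address (port 8937, see step 4)
docker run \
  --name ai-gateway \
  -v ~/.lpData/:/root/.lpData \
  -p 8937:8937 \
  livepeer/go-livepeer:master \
  -datadir /root/.lpData \
  -gateway \
  -network offchain \
  -orchAddr <ORCH_IP>:<ORCH_PORT> \
  -httpAddr 0.0.0.0:8937 \
  -httpIngest \
  -v 6
```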
3. Confirm Successful Startup

Upon successful startup, you should see startup log output from the Gateway node.
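If you started the container detached, one way to view this output (assuming the container name from the sketch above) is:

```bash
# Follow the Gateway container's logs to confirm startup.
docker logs -f ai-gateway
```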
4. Check Port Availability

Ensure that port 8937 is open and accessible, and configure your router for port forwarding if necessary to make the Gateway accessible from the internet.
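For a quick check that the Gateway's port is listening and reachable, a sketch using common networking tools (not part of the official guide):

```bash
# Confirm something is listening on port 8937 on this machine.
ss -tlnp | grep 8937

# From another machine, confirm the port is reachable (replace <GATEWAY_IP>).
nc -zv <GATEWAY_IP> 8937
```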
5. Test the AI Gateway

Follow the instructions in the AI Gateway Testing Guide to verify that the AI Gateway is operational.
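As a rough illustration of the kind of request the Testing Guide walks through, the sketch below sends a text-to-image job to the local Gateway. The `/text-to-image` path, payload fields, and `model_id` are assumptions for illustration; rely on the Testing Guide for the actual endpoints and parameters.

```bash
# Hypothetical text-to-image request against the local AI Gateway (port 8937).
# Endpoint path, JSON fields, and model_id are assumptions, not the documented API.
curl -X POST http://localhost:8937/text-to-image \
  -H "Content-Type: application/json" \
  -d '{
        "model_id": "SG161222/RealVisXL_V4.0_Lightning",
        "prompt": "a sunset over a mountain lake"
      }'
```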
Gateway Code Links
For AI processing, the Gateway extends its functionality to handle AI-specific workflows; see go-livepeer/server/ai_mediaserver.go.
- AISessionManager: Manages AI processing sessions and selects appropriate Orchestrators with AI capabilities (see ai_http.go)
- MediaMTX Integration: Handles media streaming for AI processing
- Trickle Protocol: Enables efficient streaming for real-time AI video processing
Together, these components are responsible for:
- authenticating AI streams,
- selecting AI-capable Orchestrators,
- processing payments based on pixels, and
- managing live AI pipelines.