Cloudflare · Schema

ChatCompletionRequest


Properties

| Name | Type | Description |
| --- | --- | --- |
| `model` | string | The model identifier, e.g. `@cf/meta/llama-3.1-8b-instruct`. |
| `messages` | array | The conversation messages, each an object with a `role` and `content`. |
| `max_tokens` | integer | Maximum number of tokens to generate. |
| `temperature` | number | Sampling temperature between 0 and 2. |
| `top_p` | number | Nucleus sampling parameter. |
| `stream` | boolean | Whether to stream the response via server-sent events. |
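A minimal sketch of a payload matching this schema, built in Python. The field values (model name, message contents, sampling settings) are illustrative; endpoint URLs and authentication are omitted.

```python
import json

# Illustrative ChatCompletionRequest payload assembled from the properties above.
request = {
    "model": "@cf/meta/llama-3.1-8b-instruct",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is an edge network?"},
    ],
    "max_tokens": 256,      # integer: cap on generated tokens
    "temperature": 0.7,     # number: 0–2 sampling temperature
    "top_p": 0.9,           # number: nucleus sampling cutoff
    "stream": False,        # boolean: set True for server-sent events
}

# Serialize to JSON for the request body.
body = json.dumps(request)
print(body)
```

The serialized `body` would then be sent as the JSON request body of the chat completion call.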