Community Integrations

Grok is also accessible via your favorite community integrations, including LangChain and LiteLLM.
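For LangChain, here is a minimal sketch of chatting with Grok. It assumes the community langchain-xai package and its ChatXAI class; check LangChain's provider documentation for the current interface.

from langchain_xai import ChatXAI  # assumed community package: pip install langchain-xai

# ChatXAI reads the XAI_API_KEY environment variable by default
llm = ChatXAI(model="grok-2-latest", temperature=0.2)

# invoke() sends a single chat turn and returns an AIMessage
message = llm.invoke("What's the weather like in Boston today in Fahrenheit?")
print(message.content)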


LiteLLM

LiteLLM provides a simple SDK or proxy server for calling different LLM providers. If you're using LiteLLM, integrating xAI as your provider is straightforward: just point the model name and API key in your configuration to xAI's Grok model.

For the latest information and more examples, visit the LiteLLM xAI Provider Documentation.

As a quick start, you can use LiteLLM as follows:

from litellm import completion
import os

os.environ["XAI_API_KEY"] = ""  # your xAI API key

response = completion(
    model="xai/grok-2-latest",
    messages=[
        {
            "role": "user",
            "content": "What's the weather like in Boston today in Fahrenheit?",
        }
    ],
    max_tokens=10,
    response_format={"type": "json_object"},
    seed=123,
    stop=["\n\n"],  # stop sequences must be escaped strings, not literal line breaks
    temperature=0.2,
    top_p=0.9,
    tool_choice="auto",
    tools=[],
    user="user",
)
print(response)
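LiteLLM exposes streaming through the same completion call. A minimal sketch follows; the stream=True flag and the chunk shape come from LiteLLM's OpenAI-compatible interface, so confirm against LiteLLM's streaming docs:

from litellm import completion
import os

os.environ["XAI_API_KEY"] = ""  # your xAI API key

# With stream=True, completion() returns an iterator of incremental chunks
stream = completion(
    model="xai/grok-2-latest",
    messages=[{"role": "user", "content": "Write one sentence about Boston weather."}],
    stream=True,
)
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)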

Continue

You can use the Continue extension in VS Code or JetBrains with xAI's models.

To start using xAI models with Continue, add the following to Continue's config file: ~/.continue/config.json (macOS and Linux) or %USERPROFILE%\.continue\config.json (Windows).

 "models": [
   {
     "title": "Grok-2",
     "provider": "xAI",
     "model": "grok-2-latest",
     "apiKey": "[XAI_API_KEY]"
   }
 ]
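The snippet above is a fragment that belongs inside the top-level object of config.json; merge the models array into your existing file rather than replacing it. If your config is otherwise empty, a minimal complete file might look like this sketch:

{
  "models": [
    {
      "title": "Grok-2",
      "provider": "xAI",
      "model": "grok-2-latest",
      "apiKey": "[XAI_API_KEY]"
    }
  ]
}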

Visit Continue's Documentation for more details.