OpenAI Connector #

Overview #

The OpenAI connector is the bridge between Semantic Kernel and the OpenAI API. It supports chat completion and text embedding with models such as GPT-4 and GPT-3.5.

Basic Configuration #

Adding the OpenAI Service #

```csharp
using Microsoft.SemanticKernel;

var builder = Kernel.CreateBuilder();

builder.AddOpenAIChatCompletion(
    modelId: "gpt-4",
    // GetEnvironmentVariable returns string?, so assert non-null (or validate first)
    apiKey: Environment.GetEnvironmentVariable("OPENAI_API_KEY")!
);

var kernel = builder.Build();
```

Configuration Options #

```csharp
builder.AddOpenAIChatCompletion(
    modelId: "gpt-4",
    apiKey: "your-api-key",
    orgId: "your-org-id",           // Optional: organization ID
    serviceId: "openai-gpt4"        // Optional: service identifier
);
```

Supported Models #

Chat Models #

| Model | Description | Recommended For |
|---|---|---|
| gpt-4o | Latest flagship model | Complex tasks, multimodal |
| gpt-4-turbo | High-performance GPT-4 | High-quality output |
| gpt-4 | Standard GPT-4 | Complex reasoning |
| gpt-3.5-turbo | Fast, economical model | Simple tasks, high concurrency |

Embedding Models #

| Model | Dimensions | Description |
|---|---|---|
| text-embedding-3-small | 1536 | Cost-effective |
| text-embedding-3-large | 3072 | Highest quality |
| text-embedding-ada-002 | 1536 | Legacy model |

Using an Embedding Model #

```csharp
builder.AddOpenAITextEmbeddingGeneration(
    modelId: "text-embedding-3-small",
    apiKey: "your-api-key"
);
```
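
Once registered, the embedding service can be resolved from the kernel and used directly. A minimal sketch, assuming a recent Semantic Kernel release where `ITextEmbeddingGenerationService` is still marked experimental (hence the pragma):

```csharp
using Microsoft.SemanticKernel.Embeddings;

#pragma warning disable SKEXP0001 // embedding abstractions are experimental

var embeddingService = kernel.GetRequiredService<ITextEmbeddingGenerationService>();

// Each input string yields one vector (1536 floats for text-embedding-3-small)
var embeddings = await embeddingService.GenerateEmbeddingsAsync(
    new[] { "Hello, world", "Semantic Kernel" });

Console.WriteLine($"Vectors: {embeddings.Count}, dimensions: {embeddings[0].Length}");
```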

Advanced Configuration #

Custom HttpClient #

```csharp
using System.Net;

var handler = new HttpClientHandler
{
    Proxy = new WebProxy("http://proxy:8080"),
    UseProxy = true
};

var httpClient = new HttpClient(handler)
{
    Timeout = TimeSpan.FromMinutes(5)
};

builder.AddOpenAIChatCompletion(
    modelId: "gpt-4",
    apiKey: "api-key",
    httpClient: httpClient
);
```

Configuring the Base URL #

```csharp
builder.AddOpenAIChatCompletion(
    modelId: "gpt-4",
    apiKey: "api-key",
    endpoint: new Uri("https://api.openai.com/v1")
);
```

Using a Custom Domain #

```csharp
// Use a proxy or custom endpoint
builder.AddOpenAIChatCompletion(
    modelId: "gpt-4",
    apiKey: "api-key",
    endpoint: new Uri("https://your-proxy.com/v1")
);
```

Execution Settings #

Basic Parameters #

```csharp
using Microsoft.SemanticKernel.Connectors.OpenAI;

var settings = new OpenAIPromptExecutionSettings
{
    MaxTokens = 1000,
    Temperature = 0.7,
    TopP = 0.9,
    FrequencyPenalty = 0.5,
    PresencePenalty = 0.5
};

var result = await kernel.InvokePromptAsync(
    "Write a poem",
    new KernelArguments(settings)
);
```

Full Parameter Configuration #

```csharp
var settings = new OpenAIPromptExecutionSettings
{
    MaxTokens = 2000,
    Temperature = 0.8,
    TopP = 0.95,
    FrequencyPenalty = 0.3,
    PresencePenalty = 0.3,
    StopSequences = new[] { "###", "END", "\n\n" },
    Seed = 42,
    User = "user-123",
    Logprobs = true,
    TopLogprobs = 5
};
```

Parameter Reference #

| Parameter | Range | Description |
|---|---|---|
| Temperature | 0–2 | Controls randomness; higher is more random |
| TopP | 0–1 | Nucleus sampling; controls diversity |
| MaxTokens | 1 to model limit | Maximum number of output tokens |
| FrequencyPenalty | -2 to 2 | Frequency penalty; reduces repetition |
| PresencePenalty | -2 to 2 | Presence penalty; encourages new topics |
| Seed | Integer | Random seed for reproducible results |
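
As a small illustration of the `Seed` parameter, pairing a fixed seed with `Temperature = 0` aims at reproducible output; note that OpenAI documents seeding as best-effort, not a guarantee:

```csharp
var deterministic = new OpenAIPromptExecutionSettings
{
    Temperature = 0,
    Seed = 1234  // Best-effort reproducibility, not a hard guarantee
};

// Repeated calls with identical input and settings will usually match
var first = await kernel.InvokePromptAsync("Summarize this text: ...", new KernelArguments(deterministic));
var second = await kernel.InvokePromptAsync("Summarize this text: ...", new KernelArguments(deterministic));
```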

Multi-Service Configuration #

Configuring Multiple Models #

```csharp
var builder = Kernel.CreateBuilder();

builder.AddOpenAIChatCompletion(
    serviceId: "gpt4",
    modelId: "gpt-4",
    apiKey: "api-key"
);

builder.AddOpenAIChatCompletion(
    serviceId: "gpt35",
    modelId: "gpt-3.5-turbo",
    apiKey: "api-key"
);

var kernel = builder.Build();

// Target a specific service via ServiceId on the execution settings
var result1 = await kernel.InvokePromptAsync(
    "A complex question",
    new KernelArguments(new PromptExecutionSettings { ServiceId = "gpt4" })
);

var result2 = await kernel.InvokePromptAsync(
    "A simple question",
    new KernelArguments(new PromptExecutionSettings { ServiceId = "gpt35" })
);
```

Selecting a Service Dynamically #

```csharp
public async Task<string> GetCompletionAsync(
    Kernel kernel,
    string prompt,
    bool useAdvancedModel = false)
{
    var serviceId = useAdvancedModel ? "gpt4" : "gpt35";

    var result = await kernel.InvokePromptAsync(
        prompt,
        new KernelArguments(new PromptExecutionSettings { ServiceId = serviceId })
    );

    return result.ToString();
}
```

Streaming Output #

Basic Streaming #

```csharp
await foreach (var chunk in kernel.InvokePromptStreamingAsync("Write a poem"))
{
    Console.Write(chunk);
}
```
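
If the full response text is also needed after streaming (for logging or caching), a common pattern is to accumulate the chunks while rendering them; a minimal sketch:

```csharp
using System.Text;

var sb = new StringBuilder();

await foreach (var chunk in kernel.InvokePromptStreamingAsync("Write a poem"))
{
    Console.Write(chunk);  // render incrementally
    sb.Append(chunk);      // keep the full response
}

var fullText = sb.ToString();
```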

Handling Streaming Content #

```csharp
using System.Linq;

await foreach (var content in kernel.InvokePromptStreamingAsync<StreamingChatMessageContent>("Write a story"))
{
    if (content.Content != null)
    {
        Console.Write(content.Content);
    }

    // Tool-call fragments arrive as items on the streamed message,
    // not as a FunctionCall property
    foreach (var update in content.Items.OfType<StreamingFunctionCallUpdateContent>())
    {
        Console.WriteLine($"\nFunction call: {update.Name}");
    }
}
```

Function Calling #

Enabling Function Calling #

```csharp
var settings = new OpenAIPromptExecutionSettings
{
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto()
};

var result = await kernel.InvokePromptAsync(
    "What is the weather in Beijing today?",
    new KernelArguments(settings)
);
```

Configuring Available Functions #

```csharp
kernel.Plugins.AddFromType<WeatherPlugin>("Weather");

var settings = new OpenAIPromptExecutionSettings
{
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto(
        options: new FunctionChoiceBehaviorOptions
        {
            AllowConcurrentInvocation = true,
            AllowParallelCalls = true
        }
    )
};
```
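
The `WeatherPlugin` used above is not defined in this section; a hypothetical minimal version, with a stubbed method body, might look like:

```csharp
using System.ComponentModel;
using Microsoft.SemanticKernel;

public class WeatherPlugin
{
    [KernelFunction, Description("Gets the current weather for a city")]
    public string GetWeather(
        [Description("The city name, e.g. Beijing")] string city)
    {
        // Stub: a real plugin would call a weather API here
        return $"Sunny, 25°C in {city}";
    }
}
```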

Error Handling #

Common Errors #

```csharp
try
{
    var result = await kernel.InvokePromptAsync("A question");
}
catch (HttpOperationException ex)
{
    switch (ex.StatusCode)
    {
        case System.Net.HttpStatusCode.Unauthorized:
            Console.WriteLine("Invalid API key");
            break;
        case System.Net.HttpStatusCode.TooManyRequests:
            Console.WriteLine("Too many requests");
            break;
        case System.Net.HttpStatusCode.InternalServerError:
            Console.WriteLine("Server error");
            break;
        default:
            Console.WriteLine($"HTTP error: {ex.StatusCode}");
            break;
    }
}
catch (KernelException ex)
{
    Console.WriteLine($"Kernel error: {ex.Message}");
}
```

Retry Policy #

```csharp
using Microsoft.Extensions.Http.Resilience;
using Polly;

// ResilienceHandler takes a built pipeline, not raw strategy options
var pipeline = new ResiliencePipelineBuilder<HttpResponseMessage>()
    .AddRetry(new HttpRetryStrategyOptions
    {
        MaxRetryAttempts = 3,
        BackoffType = DelayBackoffType.Exponential,
        UseJitter = true
    })
    .Build();

var httpClient = new HttpClient(
    new ResilienceHandler(pipeline)
    {
        InnerHandler = new HttpClientHandler()
    }
);

builder.AddOpenAIChatCompletion(
    modelId: "gpt-4",
    apiKey: "api-key",
    httpClient: httpClient
);
```
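
In host-based applications, an alternative (assuming the same `Microsoft.Extensions.Http.Resilience` package) is to attach the standard resilience pipeline through `IHttpClientFactory` rather than constructing the handler by hand:

```csharp
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Http.Resilience;

var services = new ServiceCollection();

// Named client with the standard pipeline (rate limiting, retries, timeouts, circuit breaker)
services.AddHttpClient("openai")
    .AddStandardResilienceHandler();
```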

Best Practices #

1. Store API Keys Securely #

```csharp
// Option 1: environment variable
var apiKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY");

// Option 2: configuration file
apiKey = configuration["OpenAI:ApiKey"];

// Option 3: Azure Key Vault (production)
var secretClient = new SecretClient(vaultUri, credential);
var secret = await secretClient.GetSecretAsync("OpenAI-ApiKey");
apiKey = secret.Value.Value;
```

2. Set Sensible Timeouts #

```csharp
var httpClient = new HttpClient
{
    Timeout = TimeSpan.FromSeconds(60)  // Adjust to your workload
};
```

3. Monitor Token Usage #

```csharp
using OpenAI.Chat;

var result = await kernel.InvokePromptAsync("A question");

// In recent connector versions the "Usage" metadata entry is a ChatTokenUsage,
// not a Dictionary<string, object>
if (result.Metadata is not null
    && result.Metadata.TryGetValue("Usage", out var value)
    && value is ChatTokenUsage usage)
{
    _logger.LogInformation(
        "Token usage: Prompt={Prompt}, Completion={Completion}, Total={Total}",
        usage.InputTokenCount,
        usage.OutputTokenCount,
        usage.TotalTokenCount
    );
}
```

4. Optimize Cost #

```csharp
// Use a smaller model for simple tasks.
// Assumes each model was registered with a serviceId equal to its model id.
public async Task<string> GetCompletionAsync(
    Kernel kernel, string prompt, TaskComplexity complexity)
{
    var serviceId = complexity switch
    {
        TaskComplexity.Simple => "gpt-3.5-turbo",
        TaskComplexity.Medium => "gpt-4-turbo",
        TaskComplexity.Complex => "gpt-4",
        _ => "gpt-3.5-turbo"
    };

    var result = await kernel.InvokePromptAsync(
        prompt,
        new KernelArguments(new PromptExecutionSettings { ServiceId = serviceId })
    );

    return result.ToString();
}
```

Next Steps #

Now that you have the OpenAI connector working, continue with the Azure OpenAI connector to learn about enterprise deployment options.

Last updated: 2026-04-04