Interacting with AI from C#
The interaction with the AI is done over HTTP: we send a request, and the server responds with the AI-generated text.
Below is the complete code. I am using deepseek deployed locally with Ollama; if you call an online API instead, the URL will be different.
using System;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;
using UnityEngine;

public class OllamaRequester
{
    [Serializable]
    public class RequestData
    {
        public string model;    // model name
        public string prompt;   // prompt text
        public int[] context;   // conversation context
        public bool stream;     // whether to use streaming
    }

    [Serializable]
    public class ResponseData
    {
        public string model;
        public string created_at;
        public string response;     // response content
        public bool done;           // whether generation has finished
        public string done_reason;  // why generation stopped
        public int[] context;       // conversation context
        public long total_duration;
        public long load_duration;
    }

    private static OllamaRequester instance;
    public static OllamaRequester Instance
    {
        get
        {
            if (instance == null)
            {
                instance = new OllamaRequester();
                instance.Init();
            }
            return instance;
        }
    }

    private int[] context;
    private HttpClient client;

    private void Init()
    {
        client = new HttpClient();
    }

    public async Task SendReq(string str, Action<ResponseData> onResOnce)
    {
        // Note: this is the local Ollama address. If you call an online AI API,
        // replace it with the URL the provider documents.
        string url = "http://localhost:11434/api/generate"; // Ollama listens on port 11434 by default

        // The request parameters also differ for online APIs; these are the ones
        // for a local deployment.
        RequestData data = new RequestData()
        {
            model = "deepseek-r1:7b",
            prompt = str,
            context = context,
            stream = true, // streaming is recommended; otherwise the response is slow
        };
        string json = JsonUtility.ToJson(data);
        HttpContent content = new StringContent(json);
        content.Headers.ContentType = new System.Net.Http.Headers.MediaTypeHeaderValue("application/json");
        Debug.Log("Sending request..");
        try
        {
            var request = new HttpRequestMessage(HttpMethod.Post, url);
            request.Content = content;
            // HttpCompletionOption.ResponseHeadersRead is essential here:
            // streaming requires it.
            HttpResponseMessage msg = await client.SendAsync(request, HttpCompletionOption.ResponseHeadersRead);
            try
            {
                // Fail immediately on any non-200 status
                if (msg.StatusCode != System.Net.HttpStatusCode.OK)
                {
                    Debug.LogError($"Error! statusCode=={msg.StatusCode}, message=={msg.Content}");
                    return;
                }
                Stream stream = await msg.Content.ReadAsStreamAsync();
                StreamReader reader = new StreamReader(stream);
                while (true)
                {
                    string resStr = await reader.ReadLineAsync();
                    Debug.Log("str==" + resStr);
                    ResponseData res = JsonUtility.FromJson<ResponseData>(resStr);
                    onResOnce?.Invoke(res);
                    if (res.done)
                    {
                        // Save the context so the next request continues the conversation
                        context = res.context;
                        break;
                    }
                }
                reader.Dispose();
                stream.Dispose();
            }
            catch (Exception e)
            {
                Debug.LogError(e);
            }
        }
        catch (Exception e)
        {
            Debug.LogError(e);
        }
    }
}
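As a usage sketch, the requester could be driven from a Unity script like the one below. The component and its field names are hypothetical, not part of the original code; each streamed chunk carries one fragment of the reply in res.response, so the fragments are concatenated to build the full answer.

```csharp
using UnityEngine;

// Hypothetical Unity component showing one way to call OllamaRequester.
public class ChatExample : MonoBehaviour
{
    private string answer = "";

    private async void Start()
    {
        await OllamaRequester.Instance.SendReq("Hello!", res =>
        {
            // Append each streamed fragment of the reply.
            answer += res.response;
            if (res.done)
            {
                Debug.Log("Full answer: " + answer);
            }
        });
    }
}
```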
Note that the SendAsync call in the code uses HttpCompletionOption.ResponseHeadersRead. This enum value makes the call return as soon as the response headers have been read, and lets you read the rest of the response incrementally from the stream.
By default, HttpClient waits until the entire response body has been downloaded before returning, which is very slow when the generated text is long! So streaming is usually recommended.
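Outside Unity, the same streaming pattern can be sketched as a plain .NET console snippet (a minimal sketch, assuming a local Ollama server at the default port; error handling omitted):

```csharp
using System;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

class StreamDemo
{
    static async Task Main()
    {
        using var client = new HttpClient();
        var request = new HttpRequestMessage(HttpMethod.Post, "http://localhost:11434/api/generate")
        {
            Content = new StringContent(
                "{\"model\":\"deepseek-r1:7b\",\"prompt\":\"hi\",\"stream\":true}")
        };

        // ResponseHeadersRead: return as soon as the headers arrive,
        // then consume the body line by line as it is generated.
        using HttpResponseMessage msg =
            await client.SendAsync(request, HttpCompletionOption.ResponseHeadersRead);
        using Stream stream = await msg.Content.ReadAsStreamAsync();
        using var reader = new StreamReader(stream);

        string line;
        while ((line = await reader.ReadLineAsync()) != null)
        {
            Console.WriteLine(line); // one JSON chunk per line
        }
    }
}
```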
Below is an example of the data format the AI responds with.
-- The API returns JSON (CSDN has no JSON code block, so I used Lua to allow comments)
{
    "model": "deepseek-r1:7b",                    -- model
    "created_at": "2025-02-24T02:03:41.8641806Z",
    "response": "",        -- response content; generation has already finished here, so response is empty
    "done": true,          -- done == true means generation has finished
    "done_reason": "stop", -- done_reason == stop means a normal finish; there are other values for abnormal termination
    "context": [           -- context; send it with the next request so the AI keeps the conversation going
        151644, 108386, 151645, 151648, 271, 151649, 271, 108386, 6313, 112169,
        104639, 56568, 3837, 104139, 109944, 106128, 9370, 101037, 11319, 102215,
        86119, 5373, 101898, 99998, 100836, 100281, 3837, 35946, 102070, 108896,
        101036, 6313, 144236
    ],
    "total_duration": 4234560000,
    "load_duration": 18329400,
    "prompt_eval_count": 4,
    "prompt_eval_duration": 274000000,
    "eval_count": 31,
    "eval_duration": 3941000000
}
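For contrast, while generation is still running each streamed line is a smaller chunk, roughly like this (an illustrative example, not captured output; the context and timing fields only appear on the final chunk):

```
{
    "model": "deepseek-r1:7b",
    "created_at": "2025-02-24T02:03:38.1234567Z",
    "response": "Hello", -- one fragment of the reply; concatenate the fragments to get the full text
    "done": false        -- still generating
}
```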