Phi4-mini now supports tools, but my first attempt at using the feature was disappointing. kinfey recently published a fix; see https://techcommunity.microsoft.com/blog/educatordeveloperblog/building-ai-agents-on-edge-devices-using-ollama--phi-4-mini-function-calling/4391029. This article walks through the steps in more detail and puts together a complete example with Semantic Kernel (SK).
First install Ollama, then pull phi4-mini with the following command:
ollama pull phi4-mini
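After the pull finishes, you can optionally confirm that the model is available locally (a quick sanity check, not part of the original steps):
ollama list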
The stock phi4-mini has problems with tools support, so its Modelfile needs to be replaced. First run the command below to view the original Modelfile and note the blob path in its FROM line:
ollama show phi4-mini --modelfile
Save the corrected Modelfile below to a directory of your choice (the Modelfile has no file extension).
FROM C:\Users\axzxs\.ollama\models\blobs\sha256-3c168af1dea0a414299c7d9077e100ac763370e5a98b3c53801a958a47f0a5db
TEMPLATE """
{{- if or .System .Tools }}{{ if .System }}{{ .System }}{{ end }}
{{- if .Tools }}
{{- if not .System }}You may call one or more functions to assist with the user query. You are provided with Function signatures.
{{- end }}[
{{- range .Tools }}{{ .Function }}{{ end }}]
{{- end }}
{{- end }}
{{- range $i, $_ := .Messages }}
{{- $last := eq (len (slice $.Messages $i)) 1 -}}
{{- if ne .Role "system" }}{{ .Content }}
{{- if .ToolCalls }}[{{ range .ToolCalls }}{"name":"{{ .Function.Name }}","arguments":{{ .Function.Arguments }}}{{ end }}]
{{- end }}
{{- if not $last }}
{{- end }}
{{- if and (ne .Role "assistant") $last }}{{ end }}
{{- end }}
{{- end }}"""

Then rebuild the model from this Modelfile with the following command:
ollama create phi4-mini -f "<path to your Modelfile>"
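Before wiring up SK, it may be worth checking that the rebuilt model actually emits tool calls at all. Below is a minimal sketch (not from the original article) that posts a chat request with an illustrative get_order_amount function definition directly to Ollama's /api/chat endpoint on the default http://localhost:11434 and prints the raw JSON reply; if the template works, the reply's message should contain a tool_calls entry.

using System;
using System.Net.Http;
using System.Text;

using var client = new HttpClient();

// One user question plus a single (illustrative) function definition the model may call.
var body = """
{
  "model": "phi4-mini",
  "stream": false,
  "messages": [
    { "role": "user", "content": "What is the total amount of order SN0000111?" }
  ],
  "tools": [
    {
      "type": "function",
      "function": {
        "name": "get_order_amount",
        "description": "Get the total amount of an order",
        "parameters": {
          "type": "object",
          "properties": {
            "order_no": { "type": "string", "description": "Order number" }
          },
          "required": ["order_no"]
        }
      }
    }
  ]
}
""";

var response = await client.PostAsync(
    "http://localhost:11434/api/chat",
    new StringContent(body, Encoding.UTF8, "application/json"));

// Print the raw JSON reply; a successful run shows "tool_calls" inside "message".
Console.WriteLine(await response.Content.ReadAsStringAsync());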
The SK code that calls Phi4-mini tools is as follows:
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.Ollama;
using Microsoft.SemanticKernel.Connectors.OpenAI;
using OllamaSharp;
using OllamaSharp.Models;
using OpenAI.RealtimeConversation;
using System;
using System.ComponentModel;
using System.ComponentModel.DataAnnotations;

#pragma warning disable

await Call();
Console.ReadLine();

async Task Call()
{
    // Register the Ollama chat completion connector for the local phi4-mini model.
    var builder = Kernel.CreateBuilder();
    var modelId = "phi4-mini";
    var endpoint = new Uri("http://localhost:11434");
    builder.Services.AddOllamaChatCompletion(modelId, endpoint);

    // Expose OrderPlugin so its functions can be advertised to the model as tools.
    builder.Plugins.AddFromType<OrderPlugin>();
    var kernel = builder.Build();

    var chatCompletionService = kernel.GetRequiredService<IChatCompletionService>();

    // Let the model decide when to call the plugin functions.
    var settings = new OllamaPromptExecutionSettings
    {
        FunctionChoiceBehavior = FunctionChoiceBehavior.Auto()
    };

    Console.Write(">>> ");
    string? input = "What is the total amount of order SN0000111?";
    Console.WriteLine(input);
    try
    {
        ChatMessageContent chatResult = await chatCompletionService.GetChatMessageContentAsync(input, settings, kernel);
        Console.Write($"\n>>> Result: {chatResult}\n\n> ");
    }
    catch (Exception ex)
    {
        Console.WriteLine($"Error: {ex.Message}\n\n> ");
    }
}

public class OrderPlugin
{
    [KernelFunction]
    [Description("Gets the total amount of an order")]
    public decimal GetOrderAmount([Description("Order number")] string orderNo)
    {
        Console.WriteLine($"Order number: {orderNo}, amount: 12345.67");
        return 12345.67m;
    }
}

From my tests the results are not ideal every time: the tool-call success rate is still unsatisfactory, and I hope an official update will improve it further.
Source: opendotnet