NousResearch/Hermes-2-Pro-Mistral-7B
7B
4,154 Pulls Updated 6 months ago
be0ad79940b4 · 3.2GB
model
arch llama · parameters 7.24B · quantization Q3_K_S
3.2GB
params
{"stop":["<|im_start|>","<|im_end|>"]}
59B
template
{{ if .System }}<|im_start|>system
{{ .System }}<|im_end|>
{{ end }}{{ if .Prompt }}<|im_start|>user
{{ .Prompt }}<|im_end|>
{{ end }}<|im_start|>assistant
156B
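The template above is the ChatML format. As a sketch, for a system message "You are helpful." and the prompt "Hey!" (both hypothetical inputs, chosen for illustration), the template renders to:

```
<|im_start|>system
You are helpful.<|im_end|>
<|im_start|>user
Hey!<|im_end|>
<|im_start|>assistant
```

The trailing `<|im_start|>assistant` line is what cues the model to generate its reply, and the `<|im_start|>`/`<|im_end|>` tokens in the stop parameters above keep generation from running into a new turn.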
system
You are a function calling AI model. You are provided with function signatures within <tools></tools> XML tags.
You may call one or more functions to assist with the user query. Don't make assumptions about what values to plug into functions.
Use the following json schema for each tool call you will make: {"title": "FunctionCall", "type": "object", "properties": {"arguments": {"title": "Arguments", "type": "object"}, "name": {"title": "Name", "type": "string"}}, "required": ["arguments", "name"]}
For each function call return a json object with function name and arguments within <tool_call></tool_call> XML tags as follows:
<tool_call>
{"arguments": <args-dict>, "name": <function-name>}
</tool_call>
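Because every tool call comes back inside `<tool_call>` tags with the two required keys from the schema above, a caller can extract and check them mechanically. A minimal sketch (assuming the model's reply is available as a plain string; `parse_tool_calls` and the sample `reply` are illustrative, not part of the repository):

```python
import json
import re

def parse_tool_calls(reply: str):
    """Extract JSON bodies from <tool_call>...</tool_call> blocks and
    check the keys the schema marks as required."""
    calls = []
    for body in re.findall(r"<tool_call>\s*(.*?)\s*</tool_call>", reply, re.DOTALL):
        call = json.loads(body)
        # The schema requires "arguments" (an object) and "name" (a string).
        if not isinstance(call.get("arguments"), dict):
            raise ValueError("tool call missing 'arguments' object")
        if not isinstance(call.get("name"), str):
            raise ValueError("tool call missing 'name' string")
        calls.append(call)
    return calls

# Sample reply in the format the system prompt asks for.
reply = """<tool_call>
{"arguments": {"symbol": "TSLA"}, "name": "get_stock_fundamentals"}
</tool_call>"""
print(parse_tool_calls(reply))
```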
707B
license
Apache License 2.0
20B
Readme
github.com/adrienbrault/ollama-nous-hermes2pro
Ollama models of NousResearch/Hermes-2-Pro-Mistral-7B-GGUF.
$ ollama run adrienbrault/nous-hermes2pro:Q4_0 'Hey!'
Hello! How can I help you today? If you have any questions or need assistance, feel free to ask.
There are `-tools` and `-json` tag variants that ship with the recommended system prompt for function calling and JSON mode.
You provide the tools with the user message:
$ ollama run adrienbrault/nous-hermes2pro:Q4_0-tools "<tools>$(cat examples/tool-stock.json)</tools>
Fetch the stock fundamentals data for Tesla (TSLA)"
<tool_call>
{"arguments": {"symbol": "TSLA"}, "name": "get_stock_fundamentals"}
</tool_call>
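The contents of `examples/tool-stock.json` are not reproduced on this page. As a hedged sketch, a function signature in the style the system prompt describes might look like the following (the field layout is an assumption for illustration, not the repository's actual file):

```json
{
  "type": "function",
  "function": {
    "name": "get_stock_fundamentals",
    "description": "Fetch fundamentals data for a stock symbol",
    "parameters": {
      "type": "object",
      "properties": {
        "symbol": {
          "type": "string",
          "description": "Ticker symbol, e.g. TSLA"
        }
      },
      "required": ["symbol"]
    }
  }
}
```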
Or a schema for the json mode:
$ ollama run adrienbrault/nous-hermes2pro:Q4_0-json "<schema>$(cat examples/user-schema.json)</schema>
Adrien Brault was born in 1991"
{"firstName": "Adrien", "lastName": "Brault", "age": 30}
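The repository's `examples/user-schema.json` is likewise not shown here. A plausible JSON Schema that would yield the output above (an assumption for illustration) could be:

```json
{
  "title": "User",
  "type": "object",
  "properties": {
    "firstName": {"type": "string"},
    "lastName": {"type": "string"},
    "age": {"type": "integer"}
  },
  "required": ["firstName", "lastName", "age"]
}
```

Note that the model infers `age` from the birth year, so the value it produces depends on the model's notion of the current date.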