
This is trivially achievable with function calling, assuming the model you use supports this (which most models do at this point).

Define a function `reportFactual(isFactual: boolean)` and you get standardized, machine-readable answers you can run statistics on.
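A minimal sketch of what that looks like with the OpenAI Python SDK (v1); the model name and the statement under test are placeholders, and forcing tool_choice guarantees the model always answers via the function:

    import json
    from openai import OpenAI

    client = OpenAI()

    tools = [{
        "type": "function",
        "function": {
            "name": "reportFactual",
            "description": "Report whether the given statement is factual.",
            "parameters": {
                "type": "object",
                "properties": {"isFactual": {"type": "boolean"}},
                "required": ["isFactual"],
            },
        },
    }]

    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder model name
        messages=[{"role": "user", "content":
            "Is this statement factual? 'Water boils at 100 C at sea level.'"}],
        tools=tools,
        # force a call to reportFactual so the output is always structured
        tool_choice={"type": "function", "function": {"name": "reportFactual"}},
    )

    call = response.choices[0].message.tool_calls[0]
    print(json.loads(call.function.arguments)["isFactual"])  # True or False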




Simpler yet, just tell the model "Reply with 'Yes' or 'No'."
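Same idea without function calling, as a rough sketch (placeholder model name again); note that parsing free text is less robust than a forced tool call:

    from openai import OpenAI

    client = OpenAI()
    answer = client.chat.completions.create(
        model="gpt-4o",  # placeholder model name
        messages=[
            {"role": "system", "content": "Reply with only 'Yes' or 'No'."},
            {"role": "user", "content":
                "Is this statement factual? 'The moon is made of cheese.'"},
        ],
    ).choices[0].message.content.strip()

    # crude parse of the constrained reply
    print(answer.lower().startswith("yes"))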


I’ve used function calls with OpenAI. But are there any good local LLMs that you can run with Ollama that support function calling?


If you expect an OpenAI-compatible API for function calls, I don't think Ollama supports that yet (to be confirmed). However, you can do it yourself using the appropriate tokens for the model. I know that Llama 3, the various Mistrals, and Command-R support function calling out of the box.

Here are the tokens to achieve this in Mixtral 8x22B: https://huggingface.co/mistralai/Mixtral-8x22B-Instruct-v0.1...

Pass function definitions in the system prompt.
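A rough sketch of hand-rolling Mixtral-Instruct's tool-call format and sending it as a raw prompt, here via Ollama's /api/generate in raw mode. The endpoint, model tag, and exact token layout are assumptions; check the linked tokenizer page for the authoritative format:

    import json
    import requests

    tools = [{
        "type": "function",
        "function": {
            "name": "reportFactual",
            "description": "Report whether the given statement is factual.",
            "parameters": {
                "type": "object",
                "properties": {"isFactual": {"type": "boolean"}},
                "required": ["isFactual"],
            },
        },
    }]

    # Mixtral-Instruct wraps tool definitions in [AVAILABLE_TOOLS] ... [/AVAILABLE_TOOLS]
    # ahead of the [INST] block; the model answers with a [TOOL_CALLS] JSON payload.
    prompt = (
        f"[AVAILABLE_TOOLS] {json.dumps(tools)}[/AVAILABLE_TOOLS]"
        "[INST] Is this statement factual? 'Water boils at 100 C at sea level.' [/INST]"
    )

    resp = requests.post("http://localhost:11434/api/generate", json={
        "model": "mixtral:8x22b",  # assumed Ollama model tag
        "prompt": prompt,
        "raw": True,     # bypass Ollama's own chat template
        "stream": False,
    })
    # expected reply: [TOOL_CALLS] [{"name": "reportFactual", "arguments": {...}}]
    print(resp.json()["response"])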


I think llamafile supports an OpenAI-compatible API:

https://github.com/Mozilla-Ocho/llamafile



