If you expect an OpenAI-compatible API with function calling, I don't think Ollama supports that yet (to be confirmed). However, you can do it yourself by prompting with the appropriate tokens for the model. I know that Llama 3, various Mistral models, and Command-R support function calling out of the box.
Define a function `reportFactual(isFactual: boolean)` and you will get standardized, machine-readable answers you can run statistics on.
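A minimal sketch of the do-it-yourself approach: embed the `reportFactual` schema in the system prompt, ask the model to answer with a single JSON object, and parse that object out of the reply. The schema format and prompt wording here are my own assumptions, not an Ollama or model-specific convention; in practice you would send `build_system_prompt(...)` to the model (e.g. via Ollama's HTTP API) and feed the raw reply to `parse_tool_call`.

```python
import json
import re

# Hypothetical schema for the reportFactual function described above.
TOOLS = [{
    "name": "reportFactual",
    "description": "Report whether the given statement is factual.",
    "parameters": {"isFactual": "boolean"},
}]

def build_system_prompt(tools):
    """Embed the tool schema in the system prompt so the model replies with a JSON call."""
    return (
        "Call exactly one of these functions by replying with a single JSON object "
        'of the form {"name": ..., "arguments": {...}} and nothing else.\n'
        f"Available functions: {json.dumps(tools)}"
    )

def parse_tool_call(reply):
    """Extract the first JSON object from the model's reply.

    Returns (name, arguments) on success, or None if the reply
    contains no parseable function call.
    """
    match = re.search(r"\{.*\}", reply, re.DOTALL)
    if not match:
        return None
    try:
        call = json.loads(match.group(0))
        return call["name"], call["arguments"]
    except (json.JSONDecodeError, KeyError, TypeError):
        return None

# Example: a well-behaved model reply.
reply = '{"name": "reportFactual", "arguments": {"isFactual": true}}'
print(parse_tool_call(reply))  # ('reportFactual', {'isFactual': True})
```

Keeping the parsing tolerant (regex for the outermost braces, fall back to `None` on junk) matters, because without native function-calling support the model will occasionally wrap the JSON in extra prose.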