
## Customization

You can modify the following variables in `src/settings.py` or `src/ollama_client.py`:

| Variable | Description |
| --- | --- |
| `OLLAMA_MODEL` | Name of the model to use (`llama3`, `mistral`, `phi3`, etc.) |
| `OLLAMA_HOST` | URL of your running Ollama server |
| `SYSTEM_PROMPT` | Defines Al's personality and tone |
| `MEMORY_FILE` | Path to Al's message history file (default: `memory.jsonl`) |

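For example, a customized `src/settings.py` might look like the sketch below. The variable names come from the table above; the values themselves are illustrative assumptions, not project defaults (apart from `memory.jsonl` and Ollama's standard local address).

```python
# src/settings.py -- illustrative values; adjust to your setup
OLLAMA_MODEL = "mistral"                # any model you have pulled into Ollama
OLLAMA_HOST = "http://localhost:11434"  # Ollama's default local address
SYSTEM_PROMPT = "You are Al, a concise assistant that answers in plain English."
MEMORY_FILE = "memory.jsonl"            # keep the default, or point to another path
```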
### Example: Custom `SYSTEM_PROMPT`

```python
SYSTEM_PROMPT = "You are Al, a helpful assistant that specializes in ancient history. You always respond with a historical fact."
```
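How these values are consumed is defined in `src/ollama_client.py`. If the client is built on the official `ollama` Python package, the wiring typically looks something like the sketch below; this is an assumption for illustration, not the project's actual implementation.

```python
# Hypothetical wiring -- assumes the official `ollama` Python package;
# the project's real src/ollama_client.py may differ.
from ollama import Client

from settings import OLLAMA_HOST, OLLAMA_MODEL, SYSTEM_PROMPT

client = Client(host=OLLAMA_HOST)

response = client.chat(
    model=OLLAMA_MODEL,
    messages=[
        # The system message carries Al's personality and tone.
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": "Tell me something about Roman aqueducts."},
    ],
)
print(response["message"]["content"])
```

With the custom `SYSTEM_PROMPT` above, the reply should lean toward ancient history regardless of the question asked.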