Configuration

The MCPurify service binary `mcpurify-filtering-service` accepts an external configuration file in TOML format. Below is an example configuration that launches a forwarding proxy on localhost, port 3000.

```toml
# This config file controls how `mcpurify-filtering-service` will be executed.
repl = false
timeout = 1000

# A list of AI providers to be used as third-party filters
[[aiprovider]]
name = "Ollama"
endpoint = "http://localhost:11434"
model = "gemma:2b"
system_prompt = "You are a helpful assistant. Reply to every prompt just with a JSON in one line."

[[aiprovider]]
name = "Anthropic"
endpoint = "https://api.anthropic.com/v1/messages"
# Check the latest version, if needed, here: https://docs.anthropic.com/en/api/versioning
anthropic-version = "2023-06-01"
# Ensure that the API key is not stored here in plaintext; it must be supplied
# by a secret provider.
key = "keyring:PATH_TO_CLAUDE_API_KEY"

[proxy]
downstream = "http://compute.local:11434"
listenaddr = "127.0.0.1:3000"
```
| Config key | Description |
| --- | --- |
| `repl` | Starts an interactive, REPL-like session that reads from stdin and returns responses via stdout. |
| `timeout` | Sets the global timeout for any filter to respond, to mitigate latency. |
| `aiprovider` | Declares a block to make use of an external AI provider. |
| `aiprovider.name` | Sets the name of the AI provider. The name is used to identify available clients. |
| `aiprovider.endpoint` | Sets the AI provider's API endpoint. |
| `aiprovider.model` | Sets the model used by the AI provider. Currently this is only useful for Ollama. |
| `aiprovider.*` | Additional key-value pairs can be defined, depending on the AI provider client. |
| `proxy` | Declares a block to configure the forwarding proxy. |
| `proxy.downstream` | Sets the downstream backend URL of an AI service provider such as Ollama or Claude. |
| `proxy.listenaddr` | Sets the local listening address and port. |
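The `key = "keyring:…"` value in the example signals that the secret should be resolved from an external store rather than read as plaintext. The actual lookup backend used by `mcpurify-filtering-service` is not specified in this document; the sketch below only illustrates the prefix convention, with an in-memory dict standing in for a real keyring:

```python
# Hypothetical stand-in for an OS keyring backend; the real lookup
# mechanism is an implementation detail of the service.
_FAKE_KEYRING = {"PATH_TO_CLAUDE_API_KEY": "sk-ant-example"}

def resolve_secret(value: str) -> str:
    """Resolve a config value that may reference an external secret store.

    Values of the form "keyring:<entry>" are looked up in the backend;
    any other value is returned unchanged.
    """
    prefix = "keyring:"
    if value.startswith(prefix):
        return _FAKE_KEYRING[value[len(prefix):]]
    return value

print(resolve_secret("keyring:PATH_TO_CLAUDE_API_KEY"))  # sk-ant-example
print(resolve_secret("plain-value"))                     # plain-value
```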

Note

MCPurify can be configured with multiple AI providers, which are used to filter MCP service requests. The proxy configuration defines the worker model responsible for receiving prompts and initiating tool invocations. Currently, only one proxy can be configured as the filtering gateway that forwards all prompts.
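The dispatch pattern this implies, fanning each request out to every configured provider under the global `timeout`, can be sketched as below. The filter functions and the deny-on-timeout policy are assumptions for illustration, not the service's documented behavior:

```python
from concurrent.futures import ThreadPoolExecutor, wait

TIMEOUT_MS = 1000  # the `timeout` key from the example configuration

# Hypothetical filter clients; real ones would call each provider's endpoint.
def ollama_filter(prompt: str) -> bool:
    return "rm -rf" not in prompt

def anthropic_filter(prompt: str) -> bool:
    return True

FILTERS = [ollama_filter, anthropic_filter]

def is_allowed(prompt: str) -> bool:
    """Run all configured filters concurrently; pass the request only if
    every filter responds within the global timeout and approves it."""
    with ThreadPoolExecutor() as pool:
        futures = [pool.submit(f, prompt) for f in FILTERS]
        done, not_done = wait(futures, timeout=TIMEOUT_MS / 1000)
        # Assumed policy: a filter that misses the timeout counts as a denial.
        return bool(done) and not not_done and all(f.result() for f in done)

print(is_allowed("list files in the project"))  # True
print(is_allowed("run rm -rf / please"))        # False
```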