Project configuration

Definition

Using the WedoLow MCP Server requires a configuration file for your project so that the tool knows:

  • How to build your project

  • How to test your project (optional)

  • How to bench your project (optional)

  • What your target platform is

  • Which function is the top function to optimize

  • Which optimization techniques you wish to enable or disable (advanced usage)

This configuration must be written in a file named wedolow_mcp_project_config.json at the project root.

Here is a basic example:

wedolow_mcp_project_config.json
{
    "build_cmd": {
        "cmd": [
            "make"
        ],
        "run_dir": null
    },
    "clean_cmd": {
        "cmd": [
            "make",
            "clean"
        ],
        "run_dir": null
    },
    "test_cmd": {
        "cmd": [
            "make",
            "test"
        ],
        "run_dir": null
    },
    "benchmark_cmd": {
        "cmd": [
            "make",
            "bench"
        ],
        "run_dir": null
    },
    "top_function": {
        "name": "FFT",
        "file": "src/fft.cpp"
    },
    "target_platform": "native"
}
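
The run_dir field of each command presumably sets the working directory in which the command is executed, with null meaning the project root; this interpretation is an assumption, not something the example above confirms. Under that assumption, a hypothetical project whose benchmarks run from a subdirectory might declare:

```json
{
    "benchmark_cmd": {
        "cmd": ["make", "bench"],
        "run_dir": "benchmarks"
    }
}
```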

You may create this file manually, or follow the section Generate your configuration to have your AI agent create it for you.
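Because the configuration is plain JSON, you can also generate and sanity-check it programmatically if you create it by hand. A minimal Python sketch, reusing the values from the basic example above:

```python
import json

# Minimal config matching the documented schema
# (values taken from the basic example above).
config = {
    "build_cmd": {"cmd": ["make"], "run_dir": None},
    "clean_cmd": {"cmd": ["make", "clean"], "run_dir": None},
    "test_cmd": {"cmd": ["make", "test"], "run_dir": None},
    "benchmark_cmd": {"cmd": ["make", "bench"], "run_dir": None},
    "top_function": {"name": "FFT", "file": "src/fft.cpp"},
    "target_platform": "native",
}

# Write the file at the project root, then read it back to
# confirm the JSON is well-formed.
with open("wedolow_mcp_project_config.json", "w") as f:
    json.dump(config, f, indent=4)

with open("wedolow_mcp_project_config.json") as f:
    loaded = json.load(f)

print(loaded["top_function"]["name"])  # FFT
```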

Supported target platforms

To get the list of supported target platforms, simply ask your AI agent:

Show me the list of target platforms supported by WedoLow MCP Server.

The AI agent will make the right call to give you the answer.

Asking Gemini CLI the list of supported target platforms

If your AI agent supports MCP resources, you may also find this information in the list of resources.

Generate your configuration

The simplest way to configure your project is to ask your AI agent for help:

Generate the wedolow_mcp_project_config.json config file for WedoLow MCP Server using the following information. My project compiles for the native target platform. It is built with the command "make", cleaned with "make clean", tested with "make test", and benchmarked with "make bench". My top function is the FFT function in the src/fft.cpp file.

You now have a correct configuration file at the root of your project.

Asking Gemini CLI to generate a configuration file for your project

Advanced usage: deactivate optimizations

If you wish to control which optimizations your AI agent searches for and applies, you may blacklist specific optimization techniques. To do that, add a new section to the configuration file:

{
    "optim_techniques": {
      "<optim-technique-name-1>": true,
      "<optim-technique-name-2>": false,
      "<optim-technique-name-3>": false
    }
}

If not defined here, optimization techniques default to true (activated).
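
Since undeclared techniques default to true, it should be enough to list only the techniques you want to turn off. For example (the technique names below are placeholders, not real technique names; ask your AI agent for the actual list):

```json
{
    "optim_techniques": {
        "<optim-technique-name-2>": false,
        "<optim-technique-name-3>": false
    }
}
```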

If you want to control this yourself, ask your AI agent to add this section:

In my wedolow_mcp_project_config.json, add section "optim_techniques" and list all the available optimization techniques there. Default them all to true.

Asking Gemini CLI to add the explicit list of available optimization techniques

Your project is fully configured. You may now run the optimization.

When optimizing C++, the top function name must include its namespace and class name, e.g., mynamespace::MyClass::funcName.
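
For instance, a C++ project (class and file names below are invented for illustration) would declare its top function as:

```json
"top_function": {
    "name": "mynamespace::MyClass::funcName",
    "file": "src/my_class.cpp"
}
```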
