You may create this file manually, or follow the section Generate your configuration to have your AI agent create it for you.
Configure using the VSCode extension
With the VSCode extension, run the command WedoLow: Setup project.
Then, use the configuration window to generate your configuration file.
Configure manually
Supported target platforms
To see the list of supported target platforms, simply ask your AI agent:
Show me the list of target platforms supported by WedoLow MCP Server.
The AI agent will make the appropriate tool call and return the answer.
Asking Gemini CLI for the list of supported target platforms
If your AI agent supports MCP resources, you may also find this information in the list of resources.
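For reference, MCP clients enumerate a server's resources through the standard resources API of the protocol. As a minimal sketch, this is the JSON-RPC request an MCP client sends to list the resources exposed by a server such as the WedoLow MCP server; the exact name of the resource returned is not specified on this page:

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "resources/list"
}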
Generate your configuration
The simplest way to configure your project is to ask your AI agent for help:
Generate the wedolow_mcp_project_config.json config file for the WedoLow MCP server using the following information. My project compiles for the native target platform. It is built with the command "make", cleaned with "make clean", tested with "make test", and benchmarked with "make bench". My top function is the FFT function in the src/fft.cpp file.
You now have a correct configuration file at the root of your project.
Asking Gemini CLI to generate a configuration file for your project
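For illustration, a file generated from the prompt above might resemble the sketch below. The exact schema is defined by the WedoLow MCP server, so every field name here is an assumption; only the values come from the prompt:

{
  "target_platform": "native",
  "build_command": "make",
  "clean_command": "make clean",
  "test_command": "make test",
  "bench_command": "make bench",
  "top_function": {
    "name": "FFT",
    "file": "src/fft.cpp"
  }
}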
Advanced usage: deactivate optimizations
If you wish to control which optimizations your AI agent searches for and applies, you may blacklist specific optimization techniques. To do that, a new section must be added to the configuration file.
If not defined here, optimization techniques default to true (activated).
If you want to control this yourself, ask your AI agent to add this section:
In my wedolow_mcp_project_config.json, add section "optim_techniques" and list all the available optimization techniques there. Default them all to true.
Asking Gemini CLI to add the explicit list of available optimization techniques
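The resulting section might resemble the following sketch. The "optim_techniques" key and the default value of true come from this page; the technique names themselves are placeholders, since the real list is provided by the server:

{
  "optim_techniques": {
    "technique_a": true,
    "technique_b": true,
    "technique_c": true
  }
}

To blacklist a technique, set its entry to false.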