Getting Clang to run inside a Visual Studio Code remote container on Docker is surprisingly difficult, but I’ll take you through all the steps you need.
Why Docker And VS Code?
If you’re reading this tutorial then you’ve probably already decided that this is a good combination for you, but just in case you are on the fence, here are some reasons that you might want to use this setup:
- A completely separate build environment. Because your application and compiler live within a Docker container, they are kept completely separate from your main system. This means no conflicts with libraries, no messing around with having multiple versions of compilers installed, and a build environment that can be recreated by other developers on your team with a single command.
- Build and debug under any Linux distribution – from within Windows. You get the convenience of your Windows desktop while seamlessly compiling, running, and debugging your application in Linux.
- Less overhead than virtual machines. Although Docker containers are similar in concept to a virtual machine, they don’t have the same resource requirements. This can save a lot of RAM, especially when running several instances.
- VS Code Remote Development. VS Code is really two pieces of software: a “client” that displays the user interface, and a “server” that writes your files to disk, runs the debugger, and so on. Because of this partitioning, the server and client can run on different machines. This can be over an SSH connection, in WSL, or in a Docker container. It allows you to use VS Code as if it were operating on your local machine, when actually it’s doing the real work elsewhere.
Prerequisites
You’ll need a few pieces of software installed before we get started:
- Visual Studio Code, of course.
- WSL2, Windows Subsystem for Linux. If you have version 1 installed, you’ll need to upgrade (a quick way to check is shown after this list).
- Docker Desktop. Make sure it’s set in WSL2 mode in the settings.
You’ll also need the Remote Development extension pack installed in VS Code.
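If you’re not sure which version your distro is running under, you can check from PowerShell before going any further. A quick sketch: the distro name “Ubuntu” below is just an example, so use whatever name appears in the list.

# Run in PowerShell: list installed distros and the WSL version each one uses
wsl --list --verbose

# If your distro shows VERSION 1, convert it to WSL2 (substitute your own distro name)
wsl --set-version Ubuntu 2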
Getting Started
Load up a shell in your WSL. You can do this by typing wsl in PowerShell or at a Command Prompt, or you can install the excellent and free Windows Terminal from the Microsoft Store. If you’re using Windows Terminal it will default to PowerShell, so click the “v” at the top and select “Ubuntu” (or whichever distro you chose when installing).
Create a directory for your project in WSL and launch Visual Studio Code. This will launch VS Code in split client/server mode: the front-end user interface runs in Windows, while the back end runs in WSL. A little later we’ll switch to running the back end in a Docker container.
I’ll call my project directory “clang-example” for this example. You can call it whatever you like.
cd ~
mkdir clang-example
cd clang-example
code .
Create Your Dockerfile
In VS Code, create a new file and name it Dockerfile (exactly like that: uppercase D, no file extension). The contents will be:
FROM alpine:latest
RUN apk --no-cache add clang llvm llvm-dev llvm-static lldb lldb-dev g++ git gcompat cmake make py3-lldb
The FROM line tells Docker to start with a base image of Alpine Linux. Alpine is a minimalistic distribution that will keep your Docker container file sizes small, and is used as a base for many Docker projects.
The next line tells it to run apk, Alpine’s package manager (equivalent to yum, apt, etc.), and install several packages. The --no-cache option prevents it from needlessly caching files and bloating the size of your container. I’ll explain why each package is required.
- clang: This should be obvious; it’s the C/C++ compiler.
- llvm: This is the “back end” for the Clang compiler.
- llvm-dev: This provides LLVMConfig.cmake so you can compile tools that integrate directly with LLVM, e.g. LLDB-MI.
- llvm-static: This provides libLLVMDemangle.a, also needed for compiling LLDB-MI.
- lldb: The debugger for LLVM.
- lldb-dev: This provides liblldb, which is needed for compiling LLDB-MI.
- g++: Clang doesn’t actually provide standard headers such as iostream, so you need to install g++ for these.
- git: This provides an easy way to get LLDB-MI, and it will integrate with VS Code in your project.
- gcompat: The VS Code cpptools extension needs this in order to run – it expects the GNU C library (glibc) to be installed, but Alpine Linux uses the alternative musl standard library. This package provides a compatibility shim between the two.
- cmake: This is needed for compiling LLDB-MI, and probably your own project as well.
- make: CMake will produce a Makefile, which make will then use to build the project.
- py3-lldb: This is a Python module used in debugging. Without it, you will get the error “No module named ‘lldb’”.
Phew, that was a lot of packages, and it took a few hours for me to discover all these dependencies…
Build & Switch To the Container
In VS Code, press Ctrl + Shift + P to bring up the command palette, select “Remote Containers: Open Folder in Container...”, press “OK”, then select “From Dockerfile”. This will start building your Docker container, which will take a few minutes. Click “(show log)” on the popup to keep an eye on the progress. Between g++, LLVM, and Clang, you can expect it to download a few hundred megabytes. Docker caches the result of each command in the Dockerfile, so if you want to add more packages later, add them in a separate RUN command instead of appending them to the existing one; that way it won’t have to redownload all the compiler packages.
If you make any changes to the Dockerfile later, you can rebuild your container by selecting “Remote Containers: Rebuild Container” from the command palette.
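Once the container has built and VS Code has reopened inside it, it’s worth opening a terminal and confirming that the toolchain from the Dockerfile is actually available. A quick sanity check, assuming the packages above installed cleanly:

clang --version      # C/C++ compiler front end
clang++ --version
lldb --version       # the LLVM debugger
cmake --version
make --version
git --version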
Install Extensions in the Container
Because the VS Code server is now running inside your Docker container, it needs to have extensions installed there as well. During the setup process it created a file .devcontainer/devcontainer.json – load that up and edit the extensions key to add Microsoft C/C++ Tools, CMake Tools, and any others you require:
{
    "extensions": [
        "ms-vscode.cpptools",
        "ms-vscode.cmake-tools"
    ]
}
After editing the file, select “Remote Containers: Rebuild Container” from the command palette.
Building LLDB-MI
LLDB-MI provides the interface between VS Code and the LLDB debugger, allowing you to step through code, set breakpoints, and so on. This used to be part of LLDB but was spun off into a separate project. There is currently no package available for it in Alpine, so you have to build it yourself. Open a terminal window in VS Code (Ctrl + `). Make sure you are in your project directory, for example /workspaces/clang-example/, and execute the following commands to download and build the tool:
git clone https://github.com/lldb-tools/lldb-mi.git
cd lldb-mi
cmake .
cmake --build .
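The build leaves the lldb-mi binary in the src subdirectory, which is the same path that miDebuggerPath in the launch.json below points at. A quick check that it exists and is executable:

# Still inside /workspaces/clang-example/lldb-mi
ls -l src/lldb-mi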
Configuring the Debugger with launch.json
VS Code has to be told which debugger to use, and this is done in the launch.json configuration file. Click on the debugging panel in VS Code (the play button with a bug, on the left), and select “create a launch.json file”. Select “C++ (GDB/LLDB)” (if you do not see this, make sure the C++ extension is installed properly). You may get an error saying it is unable to open the file, in which case just try again and it should work. You’ll now need to make some edits to the launch.json file:
{
    // Use IntelliSense to learn about possible attributes.
    // Hover to view descriptions of existing attributes.
    // For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
    "version": "0.2.0",
    "configurations": [
        {
            "name": "LLDB Launch",
            "type": "cppdbg",
            "request": "launch",
            "program": "${workspaceFolder}/a.out",
            "args": [],
            "stopAtEntry": true,
            "cwd": "${fileDirname}",
            "environment": [],
            "externalConsole": false,
            "MIMode": "lldb",
            "miDebuggerPath": "${workspaceFolder}/lldb-mi/src/lldb-mi",
            "logging": {"engineLogging": true, "trace": true, "traceResponse": true},
            "setupCommands": [
                {
                    "text": "setting set target.disable-aslr false",
                    "description": "Fix packet returned error 8",
                    "ignoreFailures": false
                }
            ]
        }
    ]
}
The things to change are:
- Change program to your program’s executable path.
- Change MIMode to lldb.
- Add miDebuggerPath.
- Modify the setupCommands.
After compiling your application, you can now use the “Play” button on the debugging tab to launch it under the debugger.
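If you don’t yet have a project of your own to test with, here is a minimal, hypothetical hello-world that exercises the whole chain. It compiles to a.out, matching the program entry in the launch.json above; -g adds debug information and -O0 disables optimization so breakpoints and stepping behave predictably:

# Run in the VS Code terminal, inside /workspaces/clang-example
cat > main.cpp <<'EOF'
#include <iostream>

int main() {
    int x = 42;                 // set a breakpoint here to try stepping
    std::cout << "x = " << x << "\n";
    return 0;
}
EOF

# Debug build; the output name matches "program" in launch.json
clang++ -g -O0 main.cpp -o a.out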
That’s It!
That’s everything – you now have CMake, the Clang compiler, and the debugger – all in Docker and hooked up for remote development with VS Code.
If you have closed VS Code and want to resume development, go to WSL and navigate to your project directory, then run code . again.
When VS Code launches, you will get a dialog with a button to “Reopen in Container”. Press this and it will launch the Docker container and open the VS Code server within it.
Troubleshooting
CMake Fails to Find Build Tools
CMake was unable to find a build program corresponding to "Unix Makefiles". CMAKE_MAKE_PROGRAM is not set. You probably need to select a different build tool.
CMake requires that the make package is installed.
Clang++ Fails to Find Standard Headers
fatal error: 'iostream' file not found
Clang requires that the g++ package is installed to provide standard headers.
Couldn’t start client cpptools
You get the error “Couldn’t start client cpptools”, and the Output window shows:
[Error – 6:15:14 PM] Starting client failed
Launching server using command /root/.vscode-server/extensions/ms-vscode.cpptools-1.7.1/bin/cpptools failed.
Trying to run cpptools from the terminal says File not found.
This occurs under Alpine Linux because cpptools requires the g++ and gcompat packages to be installed.
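If the container is already running, you can install the missing packages from the integrated terminal as a quick test; the durable fix is still to add them to the apk line in the Dockerfile and rebuild the container:

# Inside the container (this setup runs as root)
apk add --no-cache g++ gcompat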
The LLDB Debugger Hangs When Launched
When you view the Debug Console in VS Code, you see that the debugger is hung at Wait for connection completion.
Load launch.json and set "externalConsole": false.
The LLDB Debugger Immediately Exits
When you view the Debug Console in VS Code, you see:
ERROR: Unable to start debugging. Unexpected LLDB output from command "-exec-run". 'A' packet returned an error: 8
This is because the debugger is trying to disable ASLR (Address Space Layout Randomization), but cannot due to Docker’s default security settings. GDB ignores this failure, but it causes LLDB to exit.
Load launch.json and add the setupCommands given in “Configuring the Debugger with launch.json”.
The LLDB Debugger Gives “No module named ‘lldb’” or “‘run_one_line’ is not defined”
When you view the Debug Console in VS Code, you see:
Traceback (most recent call last):
  File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'lldb'
Traceback (most recent call last):
  File "<string>", line 1, in <module>
NameError: name 'run_one_line' is not defined
This is caused by the LLDB Python module not being installed. Check the package list in the instructions above, and add py3-lldb.
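To confirm the module is now visible to Python inside the container (assuming py3-lldb installs into the default site-packages path), you can try importing it directly:

# Should print the module's path rather than ModuleNotFoundError
python3 -c "import lldb; print(lldb.__file__)"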