Today I had to delete all folders with a given name in a directory tree. Seems easy:
find src -type d -name test -exec rm -r {} \;
In the src folder, find all subfolders named test, and remove them together with all their content. This actually kinda works, but the output was not as expected:
find: ‘src/Api/Car/test’: No such file or directory
find: ‘src/Business/CalculationParameter/test’: No such file or directory
find: ‘src/Infrastructure/RestClient/test’: No such file or directory
... many more
What happens here?
find finds the test folder ‘src/Api/Car/test’, executes rm -r on it, and then wants to continue by descending into that folder – which it has just deleted.
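You can reproduce this with a small throwaway tree; the layout below is made up for illustration, and the error wording is that of GNU find:
mkdir -p /tmp/find-demo/src/Api/Car/test
cd /tmp/find-demo
find src -type d -name test -exec rm -r {} \;
# rm -r removes src/Api/Car/test, then find tries to descend into it:
# find: ‘src/Api/Car/test’: No such file or directory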
So we need to remember all the folders first and delete them in a second step. My first thought was to create a temp file, pipe all the folders to delete into it, and feed that to rm afterwards.
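Roughly like this (a sketch only, with a made-up temp file name, and assuming none of the paths contain spaces or other characters that would confuse plain xargs):
find src -type d -name test > /tmp/test-dirs.txt   # remember the folders
xargs rm -r < /tmp/test-dirs.txt                   # delete them in a second pass
rm /tmp/test-dirs.txt                              # clean up the temp file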
But there is an easier way:
find src -type d -name test -print0 | xargs -0 rm -r
With the -print0 option, find will write its output in the form ‘filename\0filename’ – meaning with a zero byte as separator. This we can pipe to xargs, which will split at the zero byte because of its -0 option.
Basically the same solution, but in-memory instead of using a temp file.
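If you are curious what that zero-byte-separated output looks like, you can dump the raw bytes, for example with od (just a sketch; run it before actually deleting anything):
find src -type d -name test -print0 | od -c | head
# each matched path is followed by a literal \0 instead of a newline
The zero byte makes a good separator because it is the one character (besides the slash) that can never appear in a file name, so the pipeline also survives folder names containing spaces or even newlines.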
So now our test code is no longer deployed to production, saving us a little disk space and relieving us of that nagging feeling about putting code into production that is not needed there.