"Too many open files" errors in many contexts

I’m using a large dataset stored on Lambda cloud storage: roughly 10 million files totaling about 3 TB.

I am doing some basic processing on these files, but I can only process a certain number of directories at a time before hitting a “Too many open files” error. Once it appears, the error persists across all other applications and programs, and only goes away after restarting the instance.
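For reference, the processing is roughly like the sketch below (the path and the process_file helper are placeholders, not my exact code):

```python
import os

DATA_ROOT = "/data/dataset"  # placeholder path, not the real mount point

def process_file(path):
    # placeholder for the actual per-file work
    with open(path, "rb") as f:
        return len(f.read())

# walk each directory and process every file in it
for dirpath, dirnames, filenames in os.walk(DATA_ROOT):
    for name in filenames:
        process_file(os.path.join(dirpath, name))
```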

This has happened when using aws s3 sync and when doing basic processing in some Python programs, so it has shown up in a number of contexts.

I have tried to diagnose this with various commands such as ulimit and lsof, but with no success.
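In case it helps, the kinds of checks I ran look roughly like this (a sketch; the /proc paths assume Linux):

```python
import os
import resource

# per-process file-descriptor limits (what `ulimit -n` reports)
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f"soft limit: {soft}, hard limit: {hard}")

# descriptors currently open in this process (Linux only)
print("open fds in this process:", len(os.listdir("/proc/self/fd")))

# system-wide counts: allocated, unused, max (Linux only)
with open("/proc/sys/fs/file-nr") as f:
    allocated, unused, maximum = f.read().split()
print(f"system-wide: {allocated} allocated of {maximum} max")
```

The limits these report look reasonable, and the open-descriptor counts don’t appear to be anywhere near them, which is why I’m stuck.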