I had a SQL file with 19M rows, one INSERT statement per row, and I wanted to split it into smaller chunks. It turns out there is a
split command which does exactly that:
split -l 100000 -d --additional-suffix=.sql mv-ids.sql mv-ids-
It takes the mv-ids.sql file as input and produces
mv-ids-nn.sql files of 100k lines each. In my case nn ran from 00 to 89 and then jumped to 9000 onwards, because GNU split widens the numeric suffix once the two-digit range is about to run out. Good enough.
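To see the behavior on something smaller, here is a quick sanity check with the same flags on a 250-line file (the sample.sql name is just for illustration; assumes GNU coreutils):

```shell
# Generate 250 numbered lines as stand-in "inserts".
seq 250 > sample.sql

# Split into 100-line chunks with numeric suffixes and a .sql extension.
split -l 100 -d --additional-suffix=.sql sample.sql sample-

# Three chunks: sample-00.sql, sample-01.sql, sample-02.sql.
ls sample-*.sql

# Concatenating the chunks gives back all 250 lines, so nothing was lost.
cat sample-*.sql | wc -l
```

The same cat-and-count check works on the real 19M-row file, since the numeric suffixes keep the chunks in lexicographic order.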