I have a very large data set (~7 million lines) from a data logger. I would like to sample the data and copy every 800th line to a new text file. The motive for this is that the text file is too large to import into my analysis software. Ideally, I would like to use a batch script or something similar. Speed is very important, as it is possible that future data files could be even larger.
At one point, I had something kind of working using:
findstr /N . test.txt | findstr "^[0-9]*0:" > temporaryFile
(FOR /F "tokens=1,* delims=:" %%i in (temporaryFile) do echo(%%j) > outputFile.txt
This keeps one line out of every 10 (the second `findstr` matches line numbers ending in 0), but I am not very familiar with the syntax and have not been able to adapt it to keep every 800th line.
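For comparison, if a pure batch file is not a hard requirement, the same idea (number the lines, keep every Nth) can be sketched in a few lines of Python that stream the file instead of loading it, so memory use stays flat even for files larger than this one. The file names here are placeholders, not part of the original setup:

```python
# Sketch: copy every Nth line of a large text file to a new file.
# Streams line by line, so it never holds the whole file in memory.

STEP = 800  # keep one line out of every 800

def sample_every_nth(src, dst, step=STEP):
    with open(src, "r", encoding="utf-8", errors="replace") as fin, \
         open(dst, "w", encoding="utf-8") as fout:
        for lineno, line in enumerate(fin, start=1):
            if lineno % step == 0:  # keeps lines 800, 1600, 2400, ...
                fout.write(line)

if __name__ == "__main__":
    # placeholder file names, matching the ones used in the question
    sample_every_nth("test.txt", "outputFile.txt")
```

Changing `STEP` (or passing a different `step`) adjusts the sampling rate if future files grow and a coarser sample is needed.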
Edit: The solution posted by @LotPings works well.