
Get, monitor and remove the last N lines of a file in Linux

Jan 11, 2023 · 2 mins read

Listing and monitoring the last N lines of a log file is a common operational practice, familiar mostly to System Administrators and SRE engineers. There are certainly better ways to monitor log files, such as the popular log aggregation and monitoring stacks ELK and EFK, but the focus of this post is getting the last N lines on the spot, while SSH-ed into a live production server.

Prerequisites

  • Bash environment

Solution

Get the last N lines of a file

For example, to retrieve the last 50 lines from an Nginx error log file:

tail -n 50 /var/log/nginx/error.log
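
As an aside, tail also accepts multiple files and prefixes each block with a ==> filename <== header, which is handy for checking the access and error logs together (assuming both exist under /var/log/nginx):

tail -n 50 /var/log/nginx/access.log /var/log/nginx/error.log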

Fetch the error log lines and save them to a temp file:

tail -n 50 /var/log/nginx/error.log > /tmp/nginx.error.log
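
If you only care about specific entries, the same tail output pipes naturally into grep. A minimal sketch, where 'upstream' is just an example search pattern:

tail -n 50 /var/log/nginx/error.log | grep 'upstream'   # 'upstream' is an example pattern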

Monitor the last N lines of a file

watch tail -n 10 /var/log/nginx/error.log

Use watch's -n option to set the update interval in seconds. For instance, to poll the last 10 lines every 15 seconds:

watch -n 15 tail -n 10 /var/log/nginx/error.log

To highlight the differences between updates, use -d.

The final command looks like this:

watch -d -n 15 tail -n 10 /var/log/nginx/error.log
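
Note that polling with watch is only one option; tail itself can follow a file as new lines are appended, and the -F variant keeps following across log rotation. A minimal alternative, shown here for comparison rather than as part of the watch approach:

tail -F -n 10 /var/log/nginx/error.log   # press Ctrl+C to stop following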

Remove the last N lines of a file

To stress the point: never remove log files on a live production server. That being said, the most efficient way to delete the last few lines of a file is to use the head command. Let's say we want to remove the last 5 lines of a file. Run the following command:

head -n -5 somefile > tmp && mv tmp somefile
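
A quick sanity check, assuming a throwaway copy of the file named somefile: compare the line count before and after, and the difference should be exactly 5.

wc -l somefile                                 # e.g. 100 lines
head -n -5 somefile > tmp && mv tmp somefile
wc -l somefile                                 # now 95 lines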

If head complains with an error when you run this on macOS, use ghead instead, since the BSD head that ships with macOS doesn't support negative -n values. You'll first need coreutils installed on your machine:

brew install coreutils
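
With coreutils installed, Homebrew exposes the GNU tools with a g prefix by default, so the same removal command becomes:

ghead -n -5 somefile > tmp && mv tmp somefile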

Conclusion

Feel free to leave a comment below, and if you find this tutorial useful, follow our official channel on Telegram.