Is it wise to use robots.txt to block pagination URLs?

Google is indexing certain posts over and over, simply because posts assigned multiple tags show up on several paginated tag archives, for example:

https://www.domain.com/tag/video/page/2/

Since all of my content is already visible to search engines via the sitemap, is it wise to adjust robots.txt to exclude URLs that contain the word "page", like this?

Disallow: /tag/*page*
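
For context, here is roughly what I imagine the finished file would look like (the User-agent, Sitemap, and /ghost/ lines are guesses based on what I understand to be Ghost's stock robots.txt, not copies of mine):

User-agent: *
Sitemap: https://www.domain.com/sitemap.xml
Disallow: /ghost/
Disallow: /tag/*page*

I also realize /tag/*page* would match any tag slug that happens to contain "page" (e.g. a hypothetical /tag/landing-pages/), so perhaps the narrower pattern /tag/*/page/ is safer if the goal is only to block pagination?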

And if this is the right thing to do, do I just make a new robots.txt file based on the one already on my server, place it at the root level of the custom theme files, zip the theme up, and upload the zipped file through the Ghost dashboard? Is that correct?
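
In other words, the zipped theme would look roughly like this, with robots.txt sitting at the top level (every other file name is just a placeholder for whatever the theme already contains):

my-theme/
    assets/
    partials/
    default.hbs
    index.hbs
    post.hbs
    package.json
    robots.txt    <- the new file, at the theme root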