As I already mentioned, the 8 in 8K refers to the digital cinema format being ~8000 pixels wide. Cinema formats are defined by their width, and the height is variable depending on the aspect ratio.
Home formats are the other way around: they are defined by their height, with a variable width (though they now mostly seem to be fixed at a 16:9 aspect ratio).
1080p or "Full HD" is 1920x1080 - a 2 megapixel resolution, and close to 2000 pixels wide. Some people have retroactively called this "2K". I don't like that term.
The next step up is 3840x2160 - 4x the resolution of 1080p, or 8 megapixels. Very few people are calling this 2160p - it's usually called "Quad HD", Ultra High Definition TV/UHD, or 4K.
4K is an inaccurate term, but it's a much better name than 2160p or UHD/UHDTV - those approach the ridiculous QWHUVGAXSBS naming schemes that PC monitors have. I don't mind 4K or Quad HD. If you're thinking ahead, 4K might be the best option, even if it's not completely accurate.
And then we have 8K which is 7680x4320 - 33 megapixels, 16 times the resolution of 1080p.
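If you want to sanity-check the megapixel figures and multiples above, the arithmetic is just width times height. A minimal sketch (the format labels and rounding here are my own, not an official standard):

```python
# Pixel-count math for the home formats discussed above.
formats = {
    "1080p / Full HD": (1920, 1080),
    "2160p / UHD / 4K": (3840, 2160),
    "4320p / 8K / SHV": (7680, 4320),
}

base = 1920 * 1080  # Full HD pixel count, used as the reference

for name, (w, h) in formats.items():
    pixels = w * h
    print(f"{name}: {w}x{h} = {pixels / 1e6:.1f} MP, "
          f"{pixels // base}x the resolution of 1080p")
```

Each step doubles both dimensions, which is why the pixel count quadruples each time: 2 MP, 8 MP, 33 MP.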
Sticking with current naming conventions and calling this 4320p would be even more absurd.
This is also considered to be an Ultra High Definition TV format, which is why I don't like the UHD/UHDTV name at all - it can apply to two displays with significantly different resolutions.
Super-Hi Vision (SHV) is a good term because it's unique to the format, but 8K is probably the best fit. It's easy to say, and easy to understand.
It's either that, or we just start saying how many megapixels the display has, like we do with cameras.
That is probably the best way to illustrate the difference in resolution between the formats to a layman.
Nowhere in this does "Octo" apply. Please just admit that you made a mistake and move on.