Well, let’s think of it this way. Mathematical operations on colors in gamma space are incorrect. If I take white (255,255,255) and divide it by 2, I get (127.5,127.5,127.5), right? You might expect this to be half as bright. It’s not, though.
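To put a rough number on it (a quick sketch assuming a plain 2.2 gamma curve rather than the exact sRGB one):

```python
# Rough illustration with a plain 2.2 gamma (not the exact sRGB curve):
# the monitor turns a gamma-space value v (0.0-1.0) into roughly v ** 2.2 of light.
gamma = 2.2
print((128 / 255) ** gamma)   # ~0.22 -> about 22% of white's light, not 50%
```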
Monitors show you color in gamma space for weird historical reasons I won’t go into. When you do math with colors in shaders, you must convert to linear space first or everything you do is wrong: normals won’t look right, and lighting and shading will come out too dark or too bright. Fortunately, the conversion is simple… but keeping track of it isn’t. The problem is compounded by the fact that most image formats store pixels in gamma space, and you need to convert back to gamma space when you write out the final rendered image.
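For the curious, here’s roughly what the conversion looks like, a minimal sketch of the standard piecewise sRGB transfer function in Python (the function names are mine, not from any particular API):

```python
# A minimal sketch of sRGB <-> linear conversion using the standard piecewise
# sRGB transfer function; function names are illustrative, not a real API.

def srgb_to_linear(c: float) -> float:
    """One gamma-space (sRGB) channel in 0.0-1.0 -> linear light."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c: float) -> float:
    """One linear-light channel in 0.0-1.0 -> gamma space (sRGB) for display."""
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

# Halving brightness the right way: to linear, halve, back to gamma.
half = linear_to_srgb(srgb_to_linear(1.0) * 0.5)
print(half, round(half * 255))   # ~0.735 -> about 188, not 128
```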
Sure, it might be simple enough to make the distinction in your API. Maybe a LinearColor3 type, for example, with conversion functions in between. But it’s a mindset change: 0-255 makes no sense anymore, you really want 0.0-1.0. Linear colors don’t look like you expect them to, either, and different color gamuts make the problem even more complicated.
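Something like this is what I mean, just a sketch using the hypothetical LinearColor3 name from above (everything else here is made up for illustration):

```python
# Sketch of the type-level distinction; LinearColor3 is the hypothetical name
# from above, the rest is illustrative only.
from dataclasses import dataclass

def _srgb_to_linear(v: float) -> float:
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def _linear_to_srgb(v: float) -> float:
    return v * 12.92 if v <= 0.0031308 else 1.055 * v ** (1 / 2.4) - 0.055

@dataclass(frozen=True)
class SRGBColor3:
    """Gamma-space color, channels 0.0-1.0 (what image files and monitors hold)."""
    r: float
    g: float
    b: float

@dataclass(frozen=True)
class LinearColor3:
    """Linear-light color, channels 0.0-1.0 (the only kind you should do math on)."""
    r: float
    g: float
    b: float

    def scaled(self, k: float) -> "LinearColor3":
        # Scaling light only means what you think it means on linear values.
        return LinearColor3(self.r * k, self.g * k, self.b * k)

def to_linear(c: SRGBColor3) -> LinearColor3:
    return LinearColor3(_srgb_to_linear(c.r), _srgb_to_linear(c.g), _srgb_to_linear(c.b))

def to_srgb(c: LinearColor3) -> SRGBColor3:
    return SRGBColor3(_linear_to_srgb(c.r), _linear_to_srgb(c.g), _linear_to_srgb(c.b))
```

The nice part of a split like this is that code expecting a LinearColor3 can’t accidentally be handed raw gamma-space image data.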
I’m not really the best at explaining this, so TL;DR: linear color space is good for math, gamma color space is not.