It's not that 4K games are specially "made"; it's just a question of what the hardware is capable of. PC games let you choose the output resolution. So if a game doesn't run well on your PC, you can run it at 1600x900 or 1280x720 (lower than the 1080p standard). If you've got a nice rig and a 4K monitor, you can easily tell a game to display at 3840x2160; you just need some serious hardware power to drive those graphics.
Think of a PC game as doing many calculations per pixel per second: fancy shaders, high-res textures, shadows, bloom, materials, animation, physics, etc. going into each pixel 60 times a second (60 fps is preferred for games). At a typical 1080p resolution, the rough math is 1920x1080 = 2,073,600 pixels x (whatever calculations per pixel) x 60 fps = how much processing power per second is needed. The Unreal Engine demo http://www.youtube.com/watch?v=wdwHrCT5jr0
back in 2011 required at least 3 TERAFLOPS to run at 30 FPS. The next gen consoles have around 1.2 to 1.8 TERAFLOPS... and that's just for a 1080p 30 fps game with graphics pretty close to prerendered CG level. Since 4K (3840x2160) has exactly four times as many pixels as 1080p, you'd need roughly 4x more processing power to achieve the same level of graphics at 4K, and still only at 30 fps.
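The back-of-the-envelope math above can be sketched in a few lines. The per-pixel operation count here is a made-up placeholder (real shader costs vary wildly per game and per scene); the point is just the pixel-count scaling:

```python
# Rough pixel-budget arithmetic (illustrative only, not real benchmarks).
pixels_1080p = 1920 * 1080   # 2,073,600 pixels per frame
pixels_4k    = 3840 * 2160   # 8,294,400 pixels per frame
fps = 60

# Hypothetical per-pixel cost -- a made-up figure just for illustration.
ops_per_pixel = 1000

ops_per_sec_1080p = pixels_1080p * ops_per_pixel * fps
ops_per_sec_4k    = pixels_4k    * ops_per_pixel * fps

print(f"1080p @ {fps} fps: {ops_per_sec_1080p:.2e} ops/s")
print(f"4K    @ {fps} fps: {ops_per_sec_4k:.2e} ops/s")
print(f"4K needs {pixels_4k / pixels_1080p:.0f}x the per-frame pixel work of 1080p")
```

Whatever the real per-pixel cost is, that last ratio is exactly 4x, which is where the "4x more processing power for the same graphics at 4K" figure comes from.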
So yeah, even IF the next gen consoles had native 4K games, the calculations per pixel would have to be very low, and as a result the graphics wouldn't be as good. Geometry would be sharper with fewer jaggies, but it still wouldn't look as good as a 1080p game with a higher level of graphics and a bit of anti-aliasing (much friendlier on processing power and better "bang for buck"). 4K gaming is a LONG way off. Only people with higher end PC rigs nowadays can get decent graphics AND performance at 4K. I can tell you right now that the next gen consoles will not be able to do any spectacular 4K graphics, given that they can't even run a tech demo that ran over a year ago on a single high end PC card. The next gen will be able to decode 4K BluRays no problem, but rendering 4K graphics is a whole other story.
I mean, hell. My GTX 670 from over a year ago puts out about 2.5 Teraflops, compared to the 1.2 to 1.8 on the next gen consoles. No amount of software and OS optimization can make up for that gap, so my year-old GTX 670 can produce better graphics than the next gen consoles. And if you have a GTX 680, which produces over 3 Teraflops and is actually older than the GTX 670, it can match the graphical performance of the Xbox One and PS4 COMBINED. People are seriously overestimating the graphical performance and fidelity of the next gen consoles. That said, PC games are going to benefit greatly from the next gen consoles: developers will be building natively for DirectX 11 instead of compromising fundamental graphics so they can scale down to the current gen of consoles, and that will result in vastly superior graphics for PC versions of multiplatform games.
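To put numbers on that comparison, here's the same arithmetic spelled out. The TFLOPS figures are the approximate peak specs quoted in this post (the console numbers were estimates at the time, not final specs):

```python
# Peak-TFLOPS comparison using the approximate figures quoted above.
gtx_670  = 2.5    # TFLOPS, approximate peak
gtx_680  = 3.09   # TFLOPS, approximate peak ("over 3")
console_low, console_high = 1.2, 1.8  # estimated next-gen console range

print(f"GTX 670 vs best-case console: {gtx_670 / console_high:.2f}x")
print(f"GTX 680 vs both consoles combined: {gtx_680 / (console_low + console_high):.2f}x")
```

Even against the most generous console estimate, the year-old card comes out well ahead, and the 680 roughly equals the two consoles added together. Peak TFLOPS isn't the whole story of real-world game performance, but the gap is big enough that the conclusion holds.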