said by UnnDunn:
If not, the SFX house that did them should never have gotten the contract.
Corners get cut, it's the nature of the business where everything is pushed to come in on time and under budget.
I would like to point to Star Trek: TMP as an example. Only this time it has to do with SD vs HD. The director's cut of TMP had many new SFX inserts done completely in CGI. They were really well done. They captured the look of the models perfectly, and you would be hard-pressed to tell the difference between the CGI and the model work, which is exactly what you want when blending old and new special effects.
However, the renders were only done in SD for the DVD. This was released in 2001. HD was already on the horizon, but the time and the money weren't there to render for a resolution that wasn't going to be seen at the time, so it wasn't done. The source files were allegedly lost, so in order to get the director's cut on Blu-ray, those scenes would have to be recreated from scratch or upscaled.
The irony is, they went to far more work converting ST:TNG to Blu-ray, but that's also a much more profitable venture than a single movie.
Firefly ran in 2002, again right when HD was starting up, but all CGI was rendered in SD as a cost-cutting measure. This show has about as rabid and devoted a fanbase as you could possibly imagine.
This happens all the time. A modern SFX movie could have its effects farmed out to a half dozen or more firms. Who's to say, 15 years down the line, that these firms will be able to locate all the source files for a scene, have software that can even re-render from those files, or even still be in business?
I leave you with a blog quote on the subject from someone who works in the field.
When I was working on Serenity, there was a lot of arguing between the Lightwave and Maya artists about texture resolution. The Maya folks, who had worked on a lot of big movies, swore up and down that, for a movie, your texture maps had to be AT LEAST 4k or else the models would look like crap.
The Lightwave artists, who had mostly come from TV projects, said that was bull**** and that lower-res maps would hold up just fine.
But something had to be done, because 2 gigs of RAM was the maximum our machines could utilize at the time, and the shots were choking on all those hi-res image maps.
So, we all agreed to do a test. We rendered 4 versions of identical shots in which the models had image maps of 4k, 2k, 1k and 512 pixels. Then we went to a theater and screened them all to see what differences we could visually perceive.
My favorite moment was hearing the Maya guys in the back go "wow, I can't even see the difference between the 4k and 512 versions!"
Yes, the Lightwave folks walked out of that test screening very smug.
We decided to use 1k maps for everything, and if an object got REALLY close to camera, we'd up it to 2k.
I think we did render everything at 1080, but I'm sure, even for a movie, you could get away with 720 and upres it. No one would know the difference.
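The "choking on hi-res image maps" complaint in the quote is easy to sanity-check with back-of-the-envelope math (my own illustration, not from the blog; it assumes uncompressed 8-bit RGBA maps held in RAM, which real pipelines may compress or mip down):

```python
# Rough memory footprint of one square, uncompressed 8-bit RGBA texture map.
# Assumption for illustration: 4 channels, 1 byte each, no mipmaps/compression.
def texture_bytes(size_px, channels=4, bytes_per_channel=1):
    """Bytes needed for a size_px x size_px texture map."""
    return size_px * size_px * channels * bytes_per_channel

for size in (4096, 2048, 1024, 512):
    mb = texture_bytes(size) / (1024 ** 2)
    print(f"{size}px map: {mb:.0f} MB")
# 4096px map: 64 MB
# 2048px map: 16 MB
# 1024px map: 4 MB
#  512px map: 1 MB
```

At 64 MB per 4k map, roughly thirty such maps would already fill a 2 GB machine before geometry, frame buffers, or the OS got a byte; the same thirty maps at 1k fit in about 120 MB. The 16:1 ratio per mip level is why dropping one or two resolution steps rescues a shot.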
These comments come from an industry expert and are echoed by his peers. Even the guys who do nothing BUT visuals for a living think anything over 2k is mostly overkill.
The biggest advantage of 4k for a large-format presentation is honestly minimizing the screen-door effect. Otherwise, the eye is just too poor at picking out detail during motion.
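The acuity point can be sketched numerically. A common rule of thumb is that 20/20 vision resolves about 1 arcminute of static, high-contrast detail (and less for moving detail). The calculation below is my own illustration; the 40-degree horizontal viewing angle is an assumed, fairly immersive cinema seat, not a figure from the post:

```python
# How much of the viewer's visual field one pixel occupies, in arcminutes,
# for a screen filling a given horizontal viewing angle.
# Small-angle approximation: pixels are spread evenly across the angle.
def pixel_arcmin(h_pixels, viewing_angle_deg):
    return viewing_angle_deg * 60 / h_pixels

# ~1 arcmin is the classic static-acuity limit; motion acuity is worse.
for width in (2048, 4096):
    a = pixel_arcmin(width, 40)  # assumed 40-degree seat
    print(f"{width}px wide: {a:.2f} arcmin per pixel")
# 2048px wide: 1.17 arcmin per pixel
# 4096px wide: 0.59 arcmin per pixel
```

On this sketch, 2k pixels sit right at the static-acuity limit from a typical good seat, so the extra 4k pixels mostly shrink the visible pixel grid (screen door) rather than deliver detail the eye can track in motion.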