Monthly Archives: October, 2008

Steve Jobs may be saying “Snow Leopard” quite a bit tomorrow

One of the most widely-reported and plausible rumors floating around Apple’s MacBook refresh is the switch from Intel to NVIDIA chipsets. Obviously, Intel can’t be happy about the loss of sales. If Apple is taking their hardware in the direction I think they are, it’s going to look even worse for Intel than that.

On the surface, the switch from Intel’s GMA X3100 to a GeForce 9000-series chip will boost graphics performance tremendously. Apple usually cares about gaming performance only insofar as it can market the improved benchmark scores, but an NVIDIA chip should ensure that any lag around the Stormwind Auction House comes from the network, not the GPU.

If you look at what NVIDIA is pushing as the next big thing, however, it’s no longer texture fill rate, but parallel processing. This is why NVIDIA acquired Ageia and their PhysX physics API. If you want to see the effect PhysX has on games, the GeForce Power Pack that NVIDIA released this summer makes for a great demo. (Just don’t bother with the CTF-Lighthouse map in Unreal Tournament III unless you have a second GPU dedicated to physics.)

The demo that’s more relevant to Apple users, however, is Badaboom Media Converter, a GPU-accelerated video transcoding app. As this AnandTech review shows, the app itself is a bit rough around the edges, but the benchmarks reveal that, in principle, using the GPU for video processing saves both time and energy compared to the CPU.

Also consider that Apple has stayed out of the Blu-ray/HD DVD format war. Strategically, Apple considers online distribution through the iTunes Store to be the future, but they would be foolish not to support Blu-ray. Compare the quality of a well-mastered BD-50 like Iron Man to any high-definition iTunes Store movie, and you’ll understand why.

Apple already offloads MPEG-2 DVD video decoding to the GPU. H.264 requires more horsepower, but at download- and iPhone-friendly bitrates, the current hardware copes. Add Blu-ray video to the mix, and an Intel GMA chip no longer cuts the mustard.
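To put rough numbers on that (ballpark figures from memory, not official specs): iTunes HD movies are 720p at roughly 4 Mbit/s, while Blu-ray is 1080p with video bitrates that can hit 40 Mbit/s. A quick back-of-the-envelope in C:

    /* Ballpark decode workload: iTunes HD download vs. Blu-ray.
     * Figures are rough estimates, not official specs. */
    #include <stdio.h>

    int main(void) {
        double itunes_pixels = 1280.0 * 720 * 24;  /* 720p24, pixels/sec */
        double itunes_bits   = 4e6;                /* ~4 Mbit/s */

        double bd_pixels = 1920.0 * 1080 * 24;     /* 1080p24, pixels/sec */
        double bd_bits   = 40e6;                   /* up to ~40 Mbit/s */

        printf("Blu-ray pixel rate: %.2fx iTunes HD\n", bd_pixels / itunes_pixels);
        printf("Blu-ray bitstream:  %.1fx iTunes HD\n", bd_bits / itunes_bits);
        return 0;
    }

That’s 2.25 times the pixels and ten times the bitstream to chew through every second, and the entropy-decode stage scales with the bitstream, not the pixel count. That’s why software Blu-ray decoding pegs a CPU, and why offloading it matters.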

There’s a greater force than these at work, though. Blu-Ray playback? Power-leveling before Wrath of the Lich King is released? They don’t pay the bills.

Photoshop pays the bills, and Photoshop CS4 is now GPU-accelerated.

See how this is all falling into place now?

When Apple announced Snow Leopard, they mentioned that one of the under-the-covers improvements they’re making to OS X is OpenCL. Apple is seeking to standardize the APIs used to access the GPU for non-3D purposes, whether that’s Photoshop, Badaboom, or scientific apps like Folding@Home.
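Nobody outside Apple and the Khronos Group has seen a final spec, so take the following as pure speculation by analogy with CUDA and friends: every identifier here, header path included, is a guess at what the eventual API might look like. The shape is predictable, though: a tiny data-parallel “kernel” compiled for the GPU, plus ordinary C on the CPU to set up buffers and launch it.

    /* Speculative sketch of GPU compute through a standardized API.
     * OpenCL's spec isn't public yet, so every name below is a guess
     * modeled on existing GPGPU toolkits; error checking omitted. */
    #include <stdio.h>
    #include <OpenCL/opencl.h>   /* header path is a guess */

    /* Kernel source: each work-item scales one sample. Per-element
     * jobs like this are what video filters and Photoshop effects
     * decompose into. */
    static const char *src =
        "__kernel void scale(__global float *buf, float gain) {"
        "    size_t i = get_global_id(0);"
        "    buf[i] *= gain;"
        "}";

    int main(void) {
        enum { N = 1 << 20 };
        static float data[N];
        for (int i = 0; i < N; i++) data[i] = (float)i;

        cl_platform_id plat;
        cl_device_id dev;
        clGetPlatformIDs(1, &plat, NULL);
        clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);

        cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
        cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

        cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
        clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
        cl_kernel k = clCreateKernel(prog, "scale", NULL);

        /* Copy the working set to the GPU, run one work-item per
         * element, then read the results back. */
        cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                    sizeof data, data, NULL);
        float gain = 0.5f;
        clSetKernelArg(k, 0, sizeof buf, &buf);
        clSetKernelArg(k, 1, sizeof gain, &gain);

        size_t global = N;
        clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
        clEnqueueReadBuffer(q, buf, CL_TRUE, 0, sizeof data, data, 0, NULL, NULL);

        printf("data[2] = %f\n", data[2]);   /* expect 1.0 */
        return 0;
    }

The win isn’t the arithmetic, it’s the launch width: a million work-items dispatched in one call, scheduled across however many stream processors the GPU happens to have.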

That’s why Apple’s move to NVIDIA chips makes perfect sense. They’re getting these new MacBooks out ahead of Snow Leopard, to ensure that across their entire product line, they’re selling Macs that can utilize OpenCL out of the box. The iMac and Mac Pro lines already have GPUs capable of running OpenCL. (The Mac mini, on the other hand, is still limping along with an Intel GMA 950. I have a feeling this is a make-or-break time for the little gateway drug switcher’s special. It has always been built from notebook-spec parts, so it could make the switch to NVIDIA using the same kit as this MacBook refresh. Then again, they could just eliminate the mini entirely.)

So why does this look so bad for Intel? Recall the original announcement of the switch from PowerPC to x86. Everybody was up in arms because the Pentiums of the day were routinely being smoked by PowerPC G5s and AMD Athlons. As details of the x86 Macs trickled out, however, we discovered that Apple had gotten a sneak peek at Intel’s future products. Apple wasn’t basing their products on Ye Olde NetBurst Architecture, but on Intel’s upcoming Core chips. The rest is what-were-we-ever-worried-about history.

So if Intel is still giving Apple peeks behind the curtain, Steve Jobs must not like what he sees back there. Intel has been talking up their Larrabee project as the biggest thing since sliced bread. According to them, the discrete GPU will become obsolete. NVIDIA has scoffed that Intel is finally catching up to the GeForce 7800 in graphics performance, but I think they’re missing the point.

From a gaming point of view, there’s a clear benefit to having separate processors for graphics and physics. That CTF-Lighthouse map I mentioned earlier? It’s a brutal torture test for physics processing. On a GeForce 8800 GTS alone, I couldn’t get it to run; I’d get a black screen with a HUD and have to terminate the process. The physics processing was choking the scheduler so badly that the GPU could no longer draw the view. On a GeForce GTX 280 alone, it was a slideshow. It was only after I replaced my motherboard (faulty SATA controllers are a bad thing) that I tried using both cards together. With the GTX 280 driving the main monitor and the 8800 GTS handling only physics code, CTF-Lighthouse finally became playable.

I believe that the philosophical differences between Intel and NVIDIA come down to this:

  • Intel is pushing for integration. They want to eliminate the distinction between the CPU and GPU by putting them on the same chip. To them, it’s no different than how the 80486 combined the CPU and FPU in one package. They just can’t do it on a large-enough scale yet.
  • NVIDIA wants to keep those functions separate, because they see strength in numbers. They’d rather sell a pair of GTX 280 cards to a gamer, or a rack full of Tesla S1070s to a university, and let the parallel processors scale independently of the CPU (see the sketch below).
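To make the contrast concrete, here’s a sketch of NVIDIA’s side of the argument, with the same caveat as the sketch above (the real API is still under wraps, so these calls are guesses by analogy): enumerate every GPU in the box and carve one big job across all of them, so throughput scales with the number of cards rather than with the CPU.

    /* Speculative sketch, same caveats as before: find every GPU in
     * the box and split one big data-parallel job across them. */
    #include <stdio.h>
    #include <OpenCL/opencl.h>   /* header path is a guess */

    int main(void) {
        const size_t total_items = 1 << 24;   /* one big data-parallel job */

        cl_platform_id plat;
        clGetPlatformIDs(1, &plat, NULL);

        cl_uint n = 0;
        clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 0, NULL, &n);
        if (n == 0) {
            printf("no GPUs found\n");
            return 1;
        }

        cl_device_id devs[16];
        clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, n, devs, NULL);

        /* Each card gets its own queue and 1/n of the work; add a
         * second card and the per-card share drops, no CPU upgrade
         * required. */
        for (cl_uint i = 0; i < n; i++) {
            char name[256];
            clGetDeviceInfo(devs[i], CL_DEVICE_NAME, sizeof name, name, NULL);
            printf("GPU %u (%s): %zu work-items\n", i, name, total_items / n);
        }
        return 0;
    }

The by-hand version of this split is exactly what fixed CTF-Lighthouse for me: one card rendering, one card doing nothing but physics.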

Neither side has proven its superiority yet, which puts Apple in a good position, strangely enough. By pushing for OpenCL, they allow OS X and the apps that run on it to adapt to the hardware. If NVIDIA’s discrete-GPU approach wins, Apple will already be selling hardware that runs it, including tomorrow’s new MacBooks. If Larrabee wins, Apple is still ordering Intel CPUs, so it’s no big deal to switch back. And if AMD’s similar Fusion project steals the show… well, let’s not get too far ahead of ourselves on AMD.

WTAE, why do you hate us?

Because Pittsburgh is Big East territory, we don’t get the Red River Rivalry. While the rest of the country gets to see an epic Texas/Oklahoma tilt, what do we get?

Rutgers at Cincinnati.

People in New Jersey and Ohio don’t want to see that game! Stupid Big East contractual obligations. So what else is on…

Ivy League football on Versus? Cornell at Harvard? This I gotta see.

Thoughts from Day One (North American Version)

  • It’s a Center Ice free weekend. The Golden Corral of hockey TV. Take all you want, just eat all you take.
  • I got home too late to see the banner ceremony from Detroit. Whoop-dee-shit.
  • I also “missed” Def Leppard. Did Joe Elliott really put the Stanley Cup down upside-down? That’s gotta put a hex on the Red Wings, right? It has to.
  • Vesa Toskala is going to steal a few for Toronto this year.
  • Note to any blueliners out there: You are powerless to stop Tomas Holmstrom’s ass. Don’t even try.
  • Games Marian Hossa has escaped karmic retribution: 1
  • Was Vancouver’s offense really that good, or was Calgary’s Miikka Kiprusoff that bad?
  • The Canucks were certainly inspired by the pre-game tribute to Luc Bourdon.
  • I remember seeing Andre Roy jump over the boards for the Pens and thinking, “I’ve got a baaad feeling about this.” Get used to it, Calgary.
  • Dion Phaneuf is already in Beast Mode. Steve Bernier had the heart of a lion tonight.
  • Saw almost none of Anaheim @ San Jose, but I can say this: George Parros’ mustache is in mid-season form. Magnificent.
  • Caught the last five minutes or so of Boston @ Colorado. Saw Krejci’s game-winner and the mad scramble by the Avs at the end. That one came down to whoever landed the last punch.