Perceptual Computing, by Intel

July 25 2013

I attended a SIGGRAPH session on Perceptual Computing yesterday, and learnt a few things Intel is doing to get people excited about it. First, they gave away a few Creative Senz3D depth/gesture cameras, and I was lucky enough to get one! Second, they are running a competition titled the Perceptual Computing Challenge, with a $1 million prize pool, and they commissioned seven developers to take their best shot at building groundbreaking apps/demos/prototypes embracing Perceptual Computing technologies. Third, the SDK is supported by (or works nicely with) quite a number of frameworks: Unity, Processing, OpenFrameworks, Cinder, and many others (see the GitHub page).

Personally, I feel it’s great to see such a massive push, but it seems entirely technology driven, with little emphasis on the quality of the resulting experiences. But hey, it’s Intel and they make silicon chips, so that’s not really a surprise. Tracking still seems shaky and jittery, and when used as a building block for new interfaces and interactions, the results are generally not reliable or stable: broken (virtual) limbs, other inconsistencies, and difficult boundary conditions abound. So I’m not sure we’re quite there yet!
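As an aside, a common first-aid for jittery tracking like this is to smooth the raw joint coordinates before driving any interface with them. Here is a minimal sketch of an exponential moving average filter in Python; this is a generic technique I'm illustrating, not anything from the Intel SDK, and the `JointSmoother` class and its parameters are hypothetical names of my own.

```python
# Hypothetical sketch: exponential moving average (EMA) smoothing for
# jittery tracked joint positions. Generic technique, not Intel SDK code.

class JointSmoother:
    def __init__(self, alpha=0.3):
        # 0 < alpha <= 1; lower alpha = smoother output, but more lag
        self.alpha = alpha
        self.state = None  # last smoothed (x, y, z) position

    def update(self, raw):
        """Blend a new raw sample into the smoothed estimate."""
        if self.state is None:
            self.state = tuple(raw)
        else:
            self.state = tuple(
                self.alpha * r + (1 - self.alpha) * s
                for r, s in zip(raw, self.state)
            )
        return self.state


# Example: two noisy samples of the same fingertip
smoother = JointSmoother(alpha=0.5)
smoother.update((0.0, 0.0, 0.0))
print(smoother.update((2.0, 2.0, 2.0)))  # (1.0, 1.0, 1.0)
```

The obvious trade-off is latency: the heavier the smoothing, the more the cursor or virtual limb lags behind the user's hand, which is its own kind of unreliability.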

Their Human Interface Guidelines document is worth reading, even if you don’t plan on building apps yourself.

More info:

July 26 2013
Miguel Peres

Unfortunately, while entries were still being accepted, the competition was not open to residents of Sweden.
