Ideas on how to adapt Kinect camera tracking for 3D presentations in archaeology

I did not mention all of these in my 22 May presentation at the Digital Heritage 3D conference in Aarhus (http://conferences.au.dk/digitalheritage/).

But here are some working notes for future development:

How Xbox Kinect camera tracking could change the simulated avatar:

  1. Avatars in the simulated world change their size, clothing, or inventories – they scale relative to the typical sizes and shapes of the inhabitants, or the scale depends on the scene or avatar character chosen (see the sketch after this list).
  2. Avatars change to reflect people picking up things.
  3. Avatars role-play – different avatars see different things in the digital world.
  4. Narrator gestures affect the attention or behavior of the avatar.
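
As a rough illustration of idea 1, here is a minimal Python sketch of mapping a tracked visitor's height onto an avatar scale factor. The joint names, the 1.65 m reference height, and the example values are assumptions, not any existing Kinect API – they stand in for whatever the skeleton stream actually delivers:

```python
# Hypothetical sketch of idea 1: scale the avatar so the visitor "fits" the
# typical inhabitant of the chosen scene. Joint positions are assumed to be
# (x, y, z) tuples in metres from the Kinect skeleton stream.

REFERENCE_INHABITANT_HEIGHT_M = 1.65  # assumed typical height for the scene

def visitor_height_m(skeleton):
    """Estimate standing height from head and foot joint positions."""
    head_y = skeleton["head"][1]
    floor_y = min(skeleton["foot_left"][1], skeleton["foot_right"][1])
    return head_y - floor_y

def avatar_scale(skeleton, scene_scale=1.0):
    """Return a scale factor mapping the visitor onto the scene's inhabitants."""
    height = visitor_height_m(skeleton)
    if height <= 0:
        return scene_scale  # tracking glitch: fall back to the scene default
    return (REFERENCE_INHABITANT_HEIGHT_M / height) * scene_scale

# Example: a tall presenter in a scene whose inhabitants were shorter.
tracked = {"head": (0.0, 1.80, 2.5),
           "foot_left": (0.1, 0.02, 2.5),
           "foot_right": (-0.1, 0.0, 2.5)}
print(avatar_scale(tracked))  # about 0.92, i.e. shrink the avatar slightly
```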

How Xbox Kinect camera tracking could change the simulated world or digital objects in that world:

  1. Multiple players are needed to lift and examine objects.
  2. Objects move depending on the biofeedback of the audience or the presenter.
  3. Interfaces for Skype and Google Hangouts – remote audiences can select part of the screen and filter scenes or switch the main model to a wireframe view.
  4. Levels of authenticity and time layers can be controlled directly, or affected passively / indirectly by narrator motion or by audience motion, volume, or infrared output (a rough sketch follows this list).
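
As a rough illustration of idea 4, this Python sketch maps an aggregate audience-activity value onto a time layer of the reconstruction. The activity value would in practice come from Kinect depth-frame differencing or microphone volume; here it is just a number, and the layer names are invented for illustration:

```python
# Hypothetical sketch of idea 4: let aggregate audience activity (0.0-1.0)
# drive which "time layer" of the reconstruction is shown.

TIME_LAYERS = ["ruin (present day)", "abandonment", "occupation", "construction"]

def pick_time_layer(activity, layers=TIME_LAYERS):
    """Map normalised audience activity onto a time layer
    (more activity = further back in time). Clamps out-of-range input."""
    activity = max(0.0, min(1.0, activity))
    index = round(activity * (len(layers) - 1))
    return layers[index]

def smooth(previous, current, alpha=0.1):
    """Exponential smoothing so the scene does not flicker between layers."""
    return (1 - alpha) * previous + alpha * current

# Example: a restless audience gradually pushes the scene back in time.
level = 0.0
for sample in [0.2, 0.5, 0.9, 0.9, 0.9]:
    level = smooth(level, sample)
print(pick_time_layer(level))
```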
