Tag Archives: kinect

Kinect GUI for Minecraft and others…

In Semester 1 (March to June) and from July, Karen Miller of the Library Makerspace and Information Studies and I are ‘clients’ for Curtin software engineering students. Their brief is to build a flexible Graphical User Interface (GUI) that connects the Microsoft Kinect 1 camera to game engines such as Minecraft, so that non-programmers can easily select and map their own gestures to a command library in the virtual world or game level.

The forerunner of this project, coded by Jiayi Zhu, was cited in the NMC Technology Outlook Horizon Report. Dr Andrew Woods, HIVE manager, wrote:

Congratulations Karen Miller, Erik Champion and Jiayi Zhu on having their work cited in the NMC Technology Outlook Horizon Report < https://t.co/YeZMHU76gI >
This project was supported by the 2015 HIVE Summer Internship Program and I’m very happy this great project and Jiayi’s hard work is being acknowledged. https://maker.library.curtin.edu.au/2016/02/19/minecraft-edu-in-the-library-makerspace/

Minecraft in Stereo and camera-adjusted for a curved screen

Problem: We have a Kinect+Minecraft prototype but no code to calibrate it for a curved or cylindrical screen.

If the Minecraft prototype runs on Java and OpenGL, it might be possible to run it in stereo:
https://forums.geforce.com/default/topic/769009/3d-vision/minecraft-in-3d-vision-updated-to-1-8-x/

What is the current version of Minecraft? Java (OpenGL), or Minecraft Win10 (Pocket Edition) with DirectX 12?
I have just been told our version uses Java. One good bit of news for the day!
My hunch is that the OpenGL code from Charles Henden’s project (https://www.academia.edu/1003311/A_Surround_Display_Warp-Mesh_Utility_to_Enhance_Player_Engagement) will allow us to run a Minecraft mod on a curved (or even asymmetrical) screen, but only in OpenGL.
Combining that with stereo may pose more challenges, but even reconfigurable surface warping would be a great start. However, I have been reminded not to use the word ‘warp’ for this; strictly speaking, it is adjusting the camera for a half-cylindrical screen:

http://paulbourke.net/dome/
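For what it’s worth, the camera adjustment Bourke describes is often done as a warp mesh: a grid of vertices across the cylinder, each carrying the texture coordinate at which to sample the flat perspective render. Here is a rough sketch of that idea, assuming the camera sits at the cylinder’s centre and its horizontal half-FOV matches the screen’s half-angle; all parameter names are mine, not from Henden’s code:

```python
import math

def cylinder_warp_mesh(cols, rows, half_angle_deg=45.0, half_height=1.0):
    """Per-vertex texture coordinates mapping a flat perspective render
    onto a half-cylindrical screen viewed from its centre.

    Each mesh vertex sits on the cylinder at horizontal angle phi and
    height h; the returned (u, v) says where to sample the flat image so
    that straight lines look straight on the curved surface.
    """
    half_angle = math.radians(half_angle_deg)
    tan_max = math.tan(half_angle)
    mesh = []
    for r in range(rows + 1):
        h = (2.0 * r / rows - 1.0) * half_height       # height on the cylinder
        row = []
        for c in range(cols + 1):
            phi = (2.0 * c / cols - 1.0) * half_angle  # angle around the cylinder
            # Project the cylinder point through the camera onto the image plane.
            u = 0.5 + 0.5 * math.tan(phi) / tan_max
            # Cylinder points away from the centre column sit closer to the
            # camera than the image plane, so heights stretch by 1/cos(phi).
            v = 0.5 + 0.5 * (h / half_height) / math.cos(phi)
            row.append((u, v))
        mesh.append(row)
    return mesh
```

Note the v coordinate exceeds [0, 1] at the top and bottom edges of the screen’s extremities, which is a real consequence of the geometry: the flat render needs some vertical overscan to cover the cylinder’s corners.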

Decisions, decisions.
And there is still projection mapping to be considered!


Oh and maybe it is time to develop our own portable curved screen. Is stereo 3D necessary? Hmm…

Ideas on how to adapt Kinect camera tracking for 3D presentations in archaeology

I did not mention all of these in my 22 May presentation at the Digital Heritage 3D conference in Aarhus (http://conferences.au.dk/digitalheritage/).

But here are some working notes for future development:

How Xbox Kinect camera tracking could change the simulated avatar:

  1. Avatars in the simulated world change their size, clothing, or inventories – they scale relative to the typical sizes and shapes of the inhabitants, or scale depends on the scene or the avatar character chosen.
  2. Avatars change to reflect people picking up things.
  3. Avatars role-play – different avatars see different things in the digital world.
  4. Narrator gestures affect the attention or behavior of the avatar.
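For idea 1, the scaling could be as simple as comparing the tracked skeleton’s height to the typical height of the scene’s inhabitants. A minimal sketch – the joint-dictionary format and joint names are my own assumption, not a Kinect SDK structure:

```python
def avatar_scale(joints, scene_mean_height=1.6):
    """Scale factor for an avatar so the tracked user matches the typical
    height of a scene's inhabitants.

    joints: dict of joint name -> (x, y, z) in metres, y up, roughly as a
    Kinect skeleton stream might deliver (hypothetical format).
    """
    head_y = joints["head"][1]
    foot_y = min(joints["foot_left"][1], joints["foot_right"][1])
    user_height = head_y - foot_y
    if user_height <= 0:
        raise ValueError("skeleton not fully tracked")
    return scene_mean_height / user_height
```

A 1.8 m visitor in a scene whose inhabitants averaged 1.6 m would get a scale factor below 1, shrinking the avatar to fit its world.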

How Xbox Kinect camera tracking could change the simulated world or digital objects in that world:

  1. Multiple players are needed to lift and examine objects.
  2. Objects move depending on the biofeedback of the audience or the presenter.
  3. Interfaces for Skype and Google Hangouts – remote audiences can select part of the screen and filter scenes or wire-frame the main model.
  4. Levels of authenticity and time layers can be controlled or are passively / indirectly affected by narrator motion or audience motion / volume / infrared output.
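Idea 4 could start life as a crude mapping from aggregate audience activity to a chronological layer of the model. A hypothetical sketch – the inputs, their normalisation, and the 50/50 blend are all placeholder assumptions, not a tested design:

```python
def time_layer(motion_level, volume_level, n_layers=4):
    """Pick a chronological layer of the reconstruction from aggregate
    audience activity. Both inputs are assumed normalised to 0..1
    (e.g. averaged Kinect skeleton motion and microphone volume).
    Livelier audiences surface later time layers.
    """
    activity = max(0.0, min(1.0, 0.5 * motion_level + 0.5 * volume_level))
    # Quantise activity into one of n_layers discrete layers.
    return min(n_layers - 1, int(activity * n_layers))
```

A still, silent audience would see layer 0 (the earliest state); a noisy, mobile one would step the scene forward through its later layers.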

abstract: Motion Control For Remote Archaeological Presentations

My abstract for the 21 May talk at the Digital Heritage 3D representation conference at Moesgaard Museum, Aarhus, Denmark

Title: Motion Control For Remote Archaeological Presentations

Displaying research data to other archaeologists or to the general public usually takes the form of linear presentations, timed or stepped through by a presenter. With motion tracking and gestures picked up by a camera sensor, presenters can offer a more engaging experience, as they no longer have to rely on prepared static media, timing, or a mouse. Low-cost camera tracking allows participants to have their gestures, movements, and group behaviour fed into the virtual environment, either directly (the presenter is streamed) or indirectly (a character represents the presenter).

Using an 8 metre wide curved display (Figure 1) that can feature several on-screen panes at once, the audience can view the presenter next to a digital environment, with slides, movies, or other presentation media triggered by the presenter’s hand or arm pointing at specific objects (Figure 2). An alternative is for a character inside the digital environment to mirror the body gestures of the presenter; where the virtual character points triggers slides or other media relating to the highlighted 3D objects in the digital scene.
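The pointing trigger boils down to casting a ray along the presenter’s forearm and asking which object it most nearly aims at. A simple sketch of that test, assuming joints arrive as (x, y, z) tuples; the names and the 10-degree threshold are illustrative, not from the prototype:

```python
import math

def pointed_object(elbow, hand, objects, max_angle_deg=10.0):
    """Return the name of the object the presenter points at, or None.

    elbow, hand: (x, y, z) joint positions; objects: dict of
    name -> object centre (x, y, z). An object counts as 'pointed at'
    when the forearm direction and the hand-to-object direction differ
    by less than max_angle_deg; the closest angular match wins.
    """
    def sub(a, b):
        return tuple(a[i] - b[i] for i in range(3))

    def norm(v):
        length = math.sqrt(sum(c * c for c in v))
        return tuple(c / length for c in v)

    def dot(a, b):
        return sum(a[i] * b[i] for i in range(3))

    aim = norm(sub(hand, elbow))          # direction the forearm points
    best, best_angle = None, max_angle_deg
    for name, centre in objects.items():
        to_obj = norm(sub(centre, hand))  # direction from hand to object
        angle = math.degrees(math.acos(max(-1.0, min(1.0, dot(aim, to_obj)))))
        if angle < best_angle:
            best, best_angle = name, angle
    return best
```

The returned name would then key into whatever slide or movie pane sits beside the 3D environment.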

Acknowledgement: I would like to thank iVEC summer intern Samuel Warnock for kicking off the prototype development for me and Zigfu for allowing us access to their SDK.

Figure 1. Screenshot of stereo curved screen at the HIVE, Curtin University.

Figure 2. Screenshot of prototype and pointing mechanism at the HIVE, Curtin University.

Skyrim on PC with Kinect and Kinect One

FROM: http://projects.ict.usc.edu/mxr/faast/

Have a Kinect for Windows v2?

We have developed an experimental version of FAAST with support for the Kinect for Windows v2, available for download here (64-bit only). Please note that you must already have the Microsoft Kinect SDK v2 installed and the KinectService application running. This is based on preliminary software and/or hardware, subject to change.

FAAST is currently available for Windows only.

Video Demos: http://projects.ict.usc.edu/mxr/faast/faast-video-gallery/
Video of Skyrim and Kinect for PC: https://www.youtube.com/watch?feature=player_embedded&v=Z83wzJwrBK0

From http://www.dwvac.com/

The VAC system is a useful program which you use to issue commands to your flight simulator, role-playing game, or any program. Since you have your hands full while playing those busy games, you can now put your voice to work for you. Use your voice to speak words or phrases to issue commands to your favorite games. VAC uses a unique method of phrase recognition which greatly reduces unwanted commands caused by ambient noise.

PS4 camera appears to work with Mac OS X

As an alternative to the Kinect One (which requires Windows 8), the PS4 Eye camera has some interesting features and functions, although it is lower resolution and does not have an infrared (IR) blaster.
It costs about AUD$85, or cheaper online or via JB Hi-Fi…

How does it compare to Microsoft’s Kinect One camera?
Review in 2013: http://au.ign.com/blogs/finalverdict/2013/11/02/xbox-one-vs-playstation-4-kinect-20-vs-playstation-4-camera
Review in 2014: http://www.techradar.com/au/news/gaming/consoles/ps4-vs-xbox-720-which-is-better-1127315/5#articleContent
A not so positive review: http://www.techradar.com/au/reviews/gaming/gaming-accessories/playstation-4-camera-1202008/review

One possible future use for the PS4 Eye camera is the VR Project Morpheus: http://www.techradar.com/au/reviews/gaming/project-morpheus-1235379/review

http://www.psdevwiki.com/ps4/PlayStation_4_Camera

Available functions

  • photo, video
  • voice commands (available as well with an earset with microphone)
  • depth calculation/imaging
  • pad, move, face, head and hand recognition/tracking
  • one of the cameras can be used for generating the video image, with the other used for motion tracking.

http://www.psdevwiki.com/ps4/Talk:PlayStation_4_Camera

Features (Could be used for eye tracking..)

  • automatic black level calibration (ABLC)
  • support for 2×2 binning
  • programmable controls for frame rate, mirror and flip
  • standard serial SCCB interface
  • cropping and windowing
  • image quality controls: lens correction and defective pixel cancelling
  • two-lane MIPI/LVDS serial output interface
  • embedded 256-bit one-time programmable (OTP) memory for part identification, etc.
  • supported output formats: 8/10/12-bit RAW RGB
  • on-chip phase lock loop (PLL) (MIPI/LVDS)
  • supports horizontal and vertical sub-sampling
  • programmable I/O drive capability
  • supported image sizes: 1280×800, 640×400, and 320×200
  • built-in 1.5V regulator for core
  • supports alternate frame HDR / line HDR
  • fast mode switching
What can you do with it?
http://ps4eye.tumblr.com/
“Bigboss (@psxdev) has successfully streamed video data from the PS4 camera to OS X!”
Picture at https://twitter.com/psxdev/status/439787015606136833

Drivers for PS4
http://bigboss-eyetoy.blogspot.co.uk/2014/09/ps4eyecam-released.html
Links to https://github.com/bigboss-ps3dev/PS4EYECam/
“It is the first public driver for PlayStation 4 Camera licensed under gpl.”

Historical
PS3Eye for Mac: http://webcam-osx.sourceforge.net/

Kinect SDK 2 FINGER TRACKING (etc) for Desktops & Large Screens (VR)

We are trying to create some applications/extensions that allow people to interact naturally with 3D built environments on a desktop by pointing at or walking up to objects in the digital environment:

or a large surround screen (figure below is of the Curtin HIVE):

using a Kinect (SDK 1 or 2) for tracking. Ideally we will be able to:

  1. Green screen narrator into a 3D environment (background removal).
  2. Control an avatar in the virtual environment using speaker’s gestures.
  3. Trigger slides and movies inside a Unity environment via the speaker’s finger-pointing. Ideally the speaker could also change the chronology of the built scene with gestures (or voice), alter components or aspects of buildings, and move or replace parts of the environment. Possibly also use the (improved) Leap SDK.
  4. Better employ the curved screen so that participants can communicate with each other.

We can have a virtual/tracked hand point to objects creating an interactive slide presentation to the side of the Unity environment. As objects are pointed at information appears in a camera window/pane next to the 3D digital environment, or, these info windows are triggered on approach.
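For goal 1, the Kinect SDK has its own background-removal support, but the underlying idea is just a depth threshold: keep the colour pixels whose depth falls in a band around the speaker, and fill the rest from the virtual scene. A toy sketch with plain lists standing in for frames; the thresholds and frame format are illustrative, not the SDK’s own segmentation:

```python
def remove_background(depth_mm, near=500, far=2500):
    """Player-only mask from a depth frame, given as a list of rows of
    millimetre values (roughly what a Kinect depth stream delivers).
    Pixels outside the [near, far] band count as background.
    """
    return [[near <= d <= far for d in row] for row in depth_mm]

def composite(colour, mask, background):
    """Green-screen composite: keep the player's colour pixels where the
    mask is True, fill the rest from a same-sized virtual background.
    """
    return [
        [c if m else b for c, m, b in zip(crow, mrow, brow)]
        for crow, mrow, brow in zip(colour, mask, background)
    ]
```

In practice the mask would come per-frame from the depth stream and the composite would run on the GPU, but the logic is the same.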

A commercial solution for Kinect tracking inside Unity environments is http://zigfu.com/ but they only appear to support SDK 1, which is a bit of a problem. To rephrase:

Problem: All solutions seem to target Kinect SDK 1, and SDK 2 appears to work only on Windows 8. We use Windows 7 and Mac OS X (10.10.1).

So if anyone can help me please reply/email or comment on this post.

And for those doing similar things, here are some links I found on creating Kinect-tracked environments:

KINECT SDK 1
Kinect with MS-SDK is a set of Kinect examples, utilizing three major scripts and test models. It demonstrates how to use Kinect-controlled avatars or Kinect-detected gestures in your own Unity projects. This asset uses the Kinect SDK/Runtime provided by Microsoft. URL: http://rfilkov.com/2013/12/16/kinect-with-ms-sdk/
And here is “one more thing”: A great Unity-package for designers and developers using Playmaker, created by my friend Jonathan O’Duffy from HitLab Australia and his team of talented students. It contains many ready-to-use Playmaker actions for Kinect and a lot of example scenes. The package integrates seamlessly with ‘Kinect with MS-SDK’ and ‘KinectExtras with MsSDK’-packages.

NB
KinectExtras for Kinect v2 is part of the “Kinect v2 with MS-SDK“. This package here and “Kinect with MS-SDK” are for Kinect v1 only.

BACKGROUND REMOVAL (leaves just player)
rfilkov.wordpress.com/2013/12/17/kinectextras-with-mssdk/

FINGER TRACKING (Not good on current Kinect for various reasons)

  1. http://www.ar-tracking.com/products/interaction-devices/fingertracking/
  2. Not sure if it uses SDK 1, but FingerTracker is a Processing library that does real-time finger-tracking from depth images: http://makematics.com/code/FingerTracker/
  3. Finger tracking for interaction in augmented environments, by K. Dorfmüller-Ulhaas: https://www.ims.tuwien.ac.at/publications/tr-1882-00e.pdf – a finger tracker that allows gestural interaction and is simple, cheap, fast … is based on a marked glove, a stereoscopic tracking system and a kinematic 3-d …
  4. Video of “Finger tracking with Kinect SDK” see https://www.youtube.com/watch?v=rrUW-Z3fHkk
  5. Finger tracking with the Kinect SDK (C# source, hosted on java2s.com): http://www.java2s.com/Open-Source/CSharp_Free_Code/Xbox/Download_Finger_Tracking_with_Kinect_SDK_for_XBOX.htm
  6. Microsoft can do it: http://www.engadget.com/2014/10/08/kinect-for-windows-finger-tracking/ (might need to contact them for info)

HAND TRACKING FOR USE WITH AN OCULUS RIFT
http://nimblevr.com/ – for use with the Rift.
Download Nimble VR at http://nimblevr.com/download.html – Windows 8 required, but Mac binaries are available.

Kinect for the Web Through Javascript – KinectHacks.net

I cannot keep up, and won’t try to keep up, with these Kinect hacks after this one. Nice, though, to hear Microsoft say (via PCMag) that they are happy for the ‘modding’ community to use the USB connection as long as they don’t interfere with the Kinect software itself. Interesting, too, that you don’t need a PC or an Xbox: a Mac will do.

http://kinecthacks.net/kinect-skeleton-test/
I think this is very cool but I am easily impressed these days.

http://kinecthacks.net/kinect-for-the-web-through-javascript/

The Fluid Interfaces Group has developed an open-source Chrome extension, called DepthJS, that makes it possible for JavaScript to talk to the Kinect. This means web sites can provide custom interfaces to everyone with a Kinect and the DepthJS extension installed, without requiring them to install any new software. Check out the video:

world’s greatest shadow puppet xbox kinect hack

http://www.engadget.com/2010/11/19/kinect-hack-creates-worlds-greatest-shadow-puppet-video/ “…installation prototype created by Emily Gobeille and Theo Watson using an Xbox Kinect connected to a laptop using the libfreenect Kinect drivers and ofxKinect. The openFrameworks system tracks the elbow, wrist, thumb, and tips of the fingers to map a skeleton onto the movement and posture of an animated puppet.”

motion controller/sensors on game consoles

I am very interested in Xbox Kinect, PlayStation Move and Nintendo Wii (or Wiimote Plus) reviews, especially to see if and how they can be modded. Here is an early review: http://www.cnet.com.au/xbox-360-kinect-vs-playstation-move-vs-nintendo-wii-remote-plus-339307363.htm?feed=rss
Here is a video of using the Xbox Kinect as a form of 3D scanner: http://idav.ucdavis.edu/~okreylos/ResDev/Kinect/index.html
An article on the above: http://hackaday.com/2010/11/15/rendering-a-3d-environment-from-kinect-video/

NB (wiki): From a design standpoint, Ars Technica’s review expressed concern that the core feature of Kinect, its lack of a controller, would hamper development of games beyond those that have either stationary players or control the player’s movement automatically; remarking that the similarity of the genres of the launch titles owed to the hardware not being able to “handle much else”, they predicted that Microsoft would eventually need to release a “Move-like” controller to overcome this limitation.[76]