Category Archives: Biofeedback

free Critical Gaming eBook for 7 days

Critical Gaming: Interactive History and Virtual Heritage (2015 edition) is part of a Routledge campaign for May 2020, which allows anyone to register and get free access to the book (via this link) for 7 days. After this 7-day period, they can buy a copy for £10/$15! *Trust me, this is a lot cheaper than before!

Also check out the official Routledge History, Heritage Studies etc. Twitter page

Is there a catch? I honestly don’t know but don’t think so!

AR/MR case studies and zombies on a dig

Why do we use augmented reality for heritage? To show what is not there, to navigate and orient people, to reveal what is created intangibly by our indirect actions, or to reveal our impact on material remains.

But AR/MR/games can also reveal archaeological methods, along with an intrinsic reason to play games: zombies!

Zombies are slow and can be animated or rendered clumsily; they provide an antagonist that needs only limited AI resources; and they are associated with death, decay and the past. We have some experience with zombies and biofeedback, and with skeletons and archaeology.

  • Example: Library Skills, Archival and archaeology methods
  • Goal: The goal can be serious exploration; but with imaginative constraints and settings.
  • Game mechanic: For example: dig up zombie, match to correct time using dating methods
  • Feedback: If correctly matched to time period, zombies are animated and run amok.
  • Setting: archaeological dig, a mortuary or a library.
  • Affordance: Find artefacts that placate zombies; mortuaries require following correct rituals to rebury zombies; library archives inform the player of artefacts of value to zombies; find books of power to protect against zombies.
  • Reward: Videos or machinima augmented glimpses of potential past/individual narratives.
  • Game platform: does it have to be 3D? Could it be designed in Minecraft (open source or otherwise), Minetest, or Terraria? Augmented reality: how could it be involved? Oh, I have some ideas, but that would be telling and I’d have to charge.
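The core mechanic and feedback rule above (dig up a zombie, date it, and if correctly matched to its time period it is animated and runs amok) can be sketched in a few lines. This is a hypothetical illustration: the period names, date ranges, and function names are all my own placeholders, not real archaeological data or an actual implementation.

```python
# Illustrative period table (placeholder names and ranges, not real data).
# Years are signed integers: negative = BCE, positive = CE.
PERIODS = {
    "Iron Age": (-800, -50),
    "Roman": (-50, 410),
    "Medieval": (410, 1500),
}

def identify_period(year):
    """Return the name of the period whose date range contains `year`."""
    for name, (start, end) in PERIODS.items():
        if start <= year < end:
            return name
    return None

def match_zombie(zombie_year, player_guess_year):
    """Feedback rule from the list above: if the player's dating puts the
    zombie in its correct time period, the zombie is animated and runs amok."""
    correct = identify_period(zombie_year)
    guessed = identify_period(player_guess_year)
    return correct is not None and guessed == correct
```

A real version would replace the guessed year with the outcome of an in-game dating method (stratigraphy, typology, radiocarbon), but the match-and-feedback loop stays the same.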

Curtin Cultural Makathon

Thanks to a Curtin MCCA Strategic Grant, six researchers and Library staff at Curtin University bought Virtual Reality and Augmented Reality equipment and ran two events to help staff develop digital prototypes and experiences using cultural data resources and digital humanities tools and techniques.

  1. 26/08/2016 (AM): GLAM VR: talks on digital heritage, scholarly making & experiential media. 49 registrations; Twitter: #GLAMVR16
    THEN Cultural Datasets in a Game Engine (UNITY) & Augmented Reality Workshop, 26/08/2016 (PM). 34 registrations
  2. Curtin Cultural Makathon (11/11/2016). 20 registrations; Twitter: #ccmak16. Oh, and before the Makathon, there was a TROVE API workshop! Or read Kathryn Greenhill’s notes.

Our Curtin Cultural Makathon was great fun: four finished projects, excellent judges and data mentors, fabulous colleagues and atmosphere, plus pizza! Must do again, but with more 3D and entertainment technology! Slides: http://slides.com/erikchampion/deck-4#/

There are also GLAMVR16 slides: http://slides.com/erikchampion/glamvr16-26-08-2016#/

Yes, you can control the slides.com slides from your phone! If you like the slides.com technology, check out http://lab.hakim.se/reveal-js/

Want Western Australian / Australian datasets for your own hackathon? http://catalogue.beta.data.wa.gov.au/group/about/curtin-cultural-makathon

 

Curtin Cultural Makathon

Hack/slash/cut/bash/scrape/mod/mash – it’s a culture thing

Join the School of Media, Culture and Creative Arts and Curtin Library Makerspace to hack cultural datasets and heritage information.

Use government and institutional research data, gallery, library, archive and museum information as data sources. Experiment with data for a research project or proposal; create something accessible, beautiful and/or useful using craft, games, augmented or virtual reality, apps or something else: it’s up to you.

Date:    Thursday 10 November 2016 (afternoon) & Friday 11 November 2016 (9am – 5pm)

Location: Makerspace, level two, Robertson Library (building 105), Curtin University,  Kent Street, Bentley

Registration: Free via Eventbrite

For more information visit the Curtin Cultural Makathon website.

To volunteer to assist with data or to sponsor a prize please contact Dr Lise Summers or Dr Karen Miller.

Curtin Cultural Makathon is funded by a MCCA strategic grant. For more details on the project contact Professor Erik Champion.

Digital Heritage, Scholarly Making & Experiential Media

Our internal small grant (School of Media Culture and Creative Arts, Curtin University) was successful!

Here is a synopsis of the application (redacted):

Digital Heritage, Scholarly Making & Experiential Media

We propose

  • A one-day workshop [Friday 26 August 2016, HIVE] with 3D, Digital APIs, UNITY and Augmented Reality workshops.
  • We will present our projects at that workshop and a month later meet to review progress and each other’s publications and grants.
  • Then we will organize with the Library and other GLAM partners a cultural hackathon in Perth where programmers and other parties spend a day creating software prototypes based on our ideas from the workshop. The best project will win a prize but the IP will be open source and contestants may be invited into the research projects or related grant applications.
  • Equipment to build prototypes and showcases for future grants. Part of the money will also go into Virtual Reality headsets, and Augmented Reality equipment that can be loaned out from the MCCA store to postgraduates and students.

The above would help progress the below research projects:

  • One need is to develop maker-space and digital literacy skills in information studies and the Library Makerspace, to develop a research area in scholarly making.
  • Another project is to integrate archives and records with real-time visualisation such as in the area of digital humanities scholarship, software training in digital humanities, and hands on workshops and crafting projects at the Curtin University Library.
  • Another project is to explore how SCALAR can integrate 3D and Augmented Reality, and to create a framework for cloud-based media assets that could dynamically relate to an online scholarly publication. Could that journal, in printed form, with augmented reality trackers and head-mounted displays, become a multimedia scholarly journal where the multimedia is dynamically downloaded from the Internet and so can be continually updated? Can this work inform future developments of eSPACE and interest in ‘scholarly making’ and makerspaces?
  • There is potential to create an experiential media research cluster with the new staff of SODA, to explore immersive and interactive media that can capture emotions and affects of participants or players. This requires suitable equipment.

Review of Critical Gaming: Interactive History and Virtual Heritage

Internet Archaeology (@IntarchEditor)
16/02/2016, 7:52 PM
NEW! Review of Critical Gaming: Interactive History and Virtual Heritage dx.doi.org/10.11141/ia.40… @nzerik pic.twitter.com/TMsT7pHRx1

I have to say I found this a fair and interesting book review. My book was intended more as a primer of ideas for others to reflect on, design, and evaluate virtual heritage and interactive history projects, but the change in jobs (and countries), the chapter structure, and the word parameters resulted in some chapters being less in-depth than the topics deserved. And as I noted on Twitter, there is at least one (and probably several) reasons for the apparently too-dominant focus on built heritage! So sorry, archaeologists, but thanks to all for retweeting the review!

Ideas on how to adapt Kinect camera tracking for 3D presentations in archaeology

I did not mention all of these in my 22 May presentation at the Digital Heritage 3D conference in Aarhus (http://conferences.au.dk/digitalheritage/).

But here are some working notes for future development:

How Xbox Kinect camera tracking could change the simulated avatar:

  1. Avatars in the simulated world change their size, clothing or inventories – they scale relative to the typical sizes and shapes of the typical inhabitants, or scale is dependent on the scene or avatar character chosen.
  2. Avatars change to reflect people picking up things.
  3. Avatars role-play – different avatars see different things in the digital world.
  4. Narrator gestures affect the attention or behavior of the avatar.
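The first idea above (avatars scale relative to the typical inhabitants of the scene) reduces to a small mapping from tracked skeleton height to an avatar scale factor. A minimal sketch, assuming a Kinect-style tracker already gives us the person's height in metres; the reference height and clamp values are invented for illustration:

```python
def avatar_scale(tracked_height_m, scene_reference_height_m=1.7,
                 min_scale=0.5, max_scale=2.0):
    """Scale the simulated avatar relative to the typical inhabitant
    height for the chosen scene, clamped so extreme tracking values
    don't produce giant or tiny avatars."""
    scale = tracked_height_m / scene_reference_height_m
    return max(min_scale, min(max_scale, scale))
```

Per-scene reference heights would let the same tracked person appear differently scaled in, say, a medieval interior versus a modern one.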

How Xbox Kinect camera tracking could change the simulated world or digital objects in that world:

  1. Multiple players are needed to lift and examine objects.
  2. Objects move depending on the biofeedback of the audience or the presenter.
  3. Interfaces for Skype and Google hangout – remote audiences can select part of the screen and filter scenes or wire-frame the main model.
  4. Levels of authenticity and time layers can be controlled or are passively / indirectly affected by narrator motion or audience motion / volume / infrared output.
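Idea 4 above (time layers passively affected by audience motion or volume) could be driven by a single normalised activity signal blended across chronological layers. This is a speculative sketch, not a tested design: the layer names and the triangular blending scheme are my own assumptions.

```python
def time_layer_weights(activity, layers=("present", "1900s", "ancient")):
    """Map a normalised audience-activity signal (0.0 = quiet/still,
    1.0 = loud/moving) to blend weights over chronological layers,
    so a calm audience sees the present and an active one 'digs'
    further into the past."""
    activity = max(0.0, min(1.0, activity))
    position = activity * (len(layers) - 1)   # fractional index into the layers
    weights = [max(0.0, 1.0 - abs(position - i)) for i in range(len(layers))]
    total = sum(weights)                      # triangular blend, then normalise
    return {name: w / total for name, w in zip(layers, weights)}
```

The same mapping works whether the input is microphone volume, Kinect motion energy, or biofeedback; only the normalisation step changes.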

Kinect SDK 2 FINGER TRACKING (etc) for Desktops & Large Screens (VR)

We are trying to create some applications/extensions that allow people to interact naturally with 3D built environments on a desktop by pointing at or walking up to objects in the digital environment:

or a large surround screen (figure below is of the Curtin HIVE):

using a Kinect (SDK 1 or 2) for tracking. Ideally we will be able to:

  1. Green screen narrator into a 3D environment (background removal).
  2. Control an avatar in the virtual environment using speaker’s gestures.
  3. Trigger slides and movies inside a UNITY environment via speaker finger-pointing. Ideally the speaker could also change the chronology of the built scene with gestures (or voice), alter components or aspects of buildings, and move or replace parts of the environment. Possibly also use the (improved) Leap SDK.
  4. Better employ the curved screen so that participants can communicate with each other.

We can have a virtual/tracked hand point to objects creating an interactive slide presentation to the side of the Unity environment. As objects are pointed at information appears in a camera window/pane next to the 3D digital environment, or, these info windows are triggered on approach.
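The point-at-an-object-to-show-its-info idea above boils down to testing which object lies closest to the ray from the speaker's shoulder through their hand, both of which a Kinect-style tracker provides. A geometry-only sketch (the angular tolerance and all names are illustrative; a Unity version would just use its built-in raycasting instead):

```python
import math

def pointed_object(shoulder, hand, objects, max_angle_deg=10.0):
    """Return the name of the object whose centre lies closest to the
    pointing ray (shoulder -> hand), if within an angular tolerance.
    Positions are (x, y, z) tuples in the virtual environment's space."""
    ray = tuple(h - s for h, s in zip(hand, shoulder))
    ray_len = math.sqrt(sum(c * c for c in ray))
    best, best_angle = None, max_angle_deg
    for name, centre in objects.items():
        to_obj = tuple(c - s for c, s in zip(centre, shoulder))
        obj_len = math.sqrt(sum(c * c for c in to_obj))
        if ray_len == 0 or obj_len == 0:
            continue
        # Angle between the pointing ray and the direction to the object.
        cos_a = sum(r * o for r, o in zip(ray, to_obj)) / (ray_len * obj_len)
        angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
        if angle < best_angle:
            best, best_angle = name, angle
    return best  # None if nothing is within tolerance
```

The info pane then simply shows the record attached to whatever name comes back, or hides itself when the result is None.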

A commercial solution to Kinect tracking for use inside Unity environments is http://zigfu.com/ but they only appear to be working with SDK 1, which is a bit of a problem. To rephrase:

Problem: all solutions seem to use Kinect SDK 1, and SDK 2 only appears to work on Windows 8. We use Windows 7 and Mac OS X (10.10.1).

So if anyone can help me please reply/email or comment on this post.

And for those doing similar things, here are some links I found on creating Kinect-tracked environments:

KINECT SDK 1
Kinect with MS-SDK is a set of Kinect examples, utilizing three major scripts and test models. It demonstrates how to use Kinect-controlled avatars or Kinect-detected gestures in your own Unity projects. This asset uses the Kinect SDK/Runtime provided by Microsoft. URL: http://rfilkov.com/2013/12/16/kinect-with-ms-sdk/
And here is “one more thing”: A great Unity-package for designers and developers using Playmaker, created by my friend Jonathan O’Duffy from HitLab Australia and his team of talented students. It contains many ready-to-use Playmaker actions for Kinect and a lot of example scenes. The package integrates seamlessly with ‘Kinect with MS-SDK’ and ‘KinectExtras with MsSDK’-packages.

NB
KinectExtras for Kinect v2 is part of the “Kinect v2 with MS-SDK” package. This package and “Kinect with MS-SDK” are for Kinect v1 only.

BACKGROUND REMOVAL (leaves just player)
rfilkov.wordpress.com/2013/12/17/kinectextras-with-mssdk/

FINGER TRACKING (Not good on current Kinect for various reasons)

  1. http://www.ar-tracking.com/products/interaction-devices/fingertracking/
  2. Not sure if SDK 1 but FingerTracker is a Processing library that does real-time finger-tracking from depth images: http://makematics.com/code/FingerTracker/
  3. Finger tracking for interaction in augmented environments, by K. Dorfmüller-Ulhaas: https://www.ims.tuwien.ac.at/publications/tr-1882-00e.pdf – a finger tracker that allows gestural interaction and is simple, cheap and fast; it is based on a marked glove, a stereoscopic tracking system and a kinematic 3D model.
  4. Video of “Finger tracking with Kinect SDK” see https://www.youtube.com/watch?v=rrUW-Z3fHkk
  5. Finger tracking using C# (Kinect SDK for Xbox): http://www.java2s.com/Open-Source/CSharp_Free_Code/Xbox/Download_Finger_Tracking_with_Kinect_SDK_for_XBOX.htm
  6. Microsoft can do it: http://www.engadget.com/2014/10/08/kinect-for-windows-finger-tracking/ (might need to contact them for info, though).

HAND TRACKING FOR USE WITH AN OCULUS RIFT
http://nimblevr.com/ – for use with the Rift.
Download Nimble VR: http://nimblevr.com/download.html (Windows 8 required, but Mac binaries are available).

February 2014 talks in California

Update: http://events.berkeley.edu/index.php/calendar/sn/arf.html?event_ID=74777&date=2014-02-11

Schedule of my visit to the San Francisco Bay Area:

1) Monday 10 February, 2014. 4pm-6.00pm. Kroeber Hall, Gifford Room.

Title: What is Virtual Heritage?

Virtual heritage could be viewed as a hybrid marriage of Virtual Reality and cultural heritage. Stone and Ojika (2000) defined it as

“[It is]…the use of computer-based interactive technologies to record, preserve, or recreate artifacts, sites and actors of historic, artistic, religious, and cultural significance and to deliver the results openly to a global audience in such a way as to provide formative educational experiences through electronic manipulations of time and space.”

The above is an interesting definition, but I wish to modify it slightly, for it does not explicitly cover the preservation, communication and dissemination of beliefs, rituals, and other cultural behaviours and activities. We also need to consider authenticity of reproduction, scholastic rigor, and sensitivity to the needs of both the audience and the stakeholders of the original and remaining content. No doubt this is due to the many issues in the presentation of culture. One is the definition of culture itself; the second issue is to understand how culture is transmitted; and the third is how to transmit local, situated cultural knowledge to people from another culture. In the case of virtual heritage, a fourth also arises: exactly how could this specific cultural knowledge be transmitted digitally?

Although I personally believe that fundamental issues of culture, place and inhabitation are still to be successfully addressed (Champion, 2014); computer games offer interesting opportunities to the audience, designer, and critic. They are no longer single player, shallow interfaces. They are turning into multivalent, multi-dimensional, user-directed collaborative virtual worlds. Commercial games are often bundled with world creation technology and network capability that is threatening to overtake the creation and presentation displays of expensive and complex specialist VR systems. In this talk I will not suggest that computer games are revolutionary, only that they are potentially changing the way we think, act, communicate, and feel.

 References

Stone, R. J., & Ojika, T. (2000). Virtual Heritage: What Next? IEEE Multimedia, 7(2), 73–74.

Champion, E. (2014). History and Heritage in Virtual Worlds. In M. Grimshaw (Ed.), The Oxford Handbook of Virtuality. Oxford: Oxford University Press.

2) Tuesday 11 February 10.30-12.30pm. 2224 Piedmont Avenue, MACTiA lab (room 12)

Informal Workshop/Brainstorm/Discussion with Dr. Erik Champion: Games – serious or otherwise – for and about archaeology and cultural heritage

Please feel free to drop in to this workshop and brainstorming session, where archaeologists and Erik Champion will work through ideas and plans for the design of computer games based on archaeological research data, cultural heritage management, and interpretations of the past.

Starting point: Champion, Erik (2011) Playing with the Past. Springer, London.

3) Wednesday 12 February, 2014. 12noon-1pm. Archaeological Research Facility Lunchtime series: 2251 Building, Room 101.

Title: Heritage Via Games and Game Mods

In this informal talk, I will discuss classroom experiences (both good and bad) gleaned from teaching game design, especially work by students to develop serious games using historical events or mythological happenings.

My central argument is that despite apparent initial barriers, both students and teachers (and academics in general) can learn from the actual process of game design, and from watching people play. Theorists learn about the entangled issues of game design, the politics of user testing, and the designer fallacy (I designed the game, I know how best to experience it, if the audience can’t work it out there is something wrong with them, not the design). Students, in turn, can begin to understand (perhaps) how theory, good theory, can help open eyes, inspire new design and turn description into prescription. There are of course even more dilemmas and difficulties for visualizing and interacting with history and with heritage, and with moving from easily accessible commercial games and open source games, to larger Virtual Reality centres, planetariums and museums, but it has been done, with some significant successes.

This talk will touch on and move past projects mentioned in the following and free to download book: Champion, Erik (Ed.). (2012). Game Mods: Design, Theory and Criticism. Pittsburgh: ETC Press. URL: http://press.etc.cmu.edu/content/game-mods

Off-campus

4) Thursday 13 February, 2014. 12noon. Modeling, Virtual Environments and Simulation (MOVES) Institute, Naval Postgraduate School, Monterey.

TITLE:  Cultural Heritage and Surround Displays, VR and Games for the Humanities OR Immersive Digital Humanities: When The Motion Tracker is Mightier Than The Pen

How are scholars using surround displays, stereographics, gaming technologies and new peripherals to disseminate new ways of viewing, interacting with, and understanding humanities content, and in particular, cultural heritage? Which issues in cultural heritage and interacting with historical content need to be kept in mind by VR experts when working with humanities scholars? And are there key concepts and research developments in the VR field that humanities scholars should be more aware of? Or are the fields of interaction design and (digital) humanities converging?

NB Public talk but guests have to be pre-approved as it is at the Naval Postgraduate School.

Which hardware and software packages are vital for virtual reality and game display centres?

We have stereo and surround displays being built here at Curtin, with typical Unity, AutoDesk and Adobe products.
But I feel we are missing a range of peripherals, so I made a quick list (I cannot find the Sony VR bike, or I would add it).
Which reminds me of PaperDude VR: http://techland.time.com/2013/08/02/paperdude-vr-paperboy-meets-virtual-reality-helmet-meets-motion-sensor-meets-connected-bike/

Anyway, the bike is a great natural interface for VR, especially for virtual simulations of large cities.

Suggested hardware
Virtual Reality bike interface http://www.computrainer.com.au/Buyonline.aspx

Biofeedback
http://emotiv.com/ especially the EEG headset http://emotiv.com/eeg/features.php
An alternative headgear set would be http://www.neurosky.com/Developer.aspx

3D
3D printer, possibly http://www.stratasys.com/3d-printers/design-series

Other Peripherals
In the past I mentioned Siftables (https://www.sifteo.com/); the product seems a shadow of its potential, I wonder what happened.
This talk explains them here http://www.ted.com/talks/david_merrill_demos_siftables_the_smart_blocks.html
Only from USA stores I think http://www.marblesthebrainstore.com/locations

Nice to have: Arduino for prototyping simple peripherals http://techcrunch.com/2013/09/05/bitalino/

A drone http://ardrone2.parrot.com/ even archaeologists use them

Haptics
http://www.immersion.com/markets/gaming/
Probably the http://www.immersion.com/markets/gaming/products/index.html#tab=logitech if we are going to do urban vis in the dome
There is even a fishing pole with feedback! http://www.immersion.com/markets/gaming/products/index.html#tab=griffin (no, we have no use for this, don’t buy it!)
http://tngames.com/ 3rd space vest http://tngames.com/products
Kickstarter vest http://games.on.net/2013/06/araig-is-a-force-feedback-suit-for-gaming-and-they-want-your-kickstarter-dollars/
Or joystick http://www.thrustmaster.com/products/force-feedback-joystick

An excellent camera (DSLR) or even a panorama camera; I know iVEC has them at UWA, but I don’t think Curtin does?
http://www.ptgrey.com/PRODUCTS/ladybug2/ladybug2_360_video_camera.asp
I am not sure if we need a gigapixel camera or will borrow from iVEC@UWA

Software
For urban vis http://www.esri.com/software/cityengine
(Warning: Sambit thinks it is clunky, but I know of no decent competitors.)
PS Wesley might find some good google earth data here https://earthengine.google.org/#intro

cycle trainer
http://www.tacx.com/en/products/software
review of above http://djconnel.blogspot.com.au/2012/10/interbike-2012-virtual-reality-trainers.html

motion capture http://organicmotion.com/products/openstage

3D modelling http://pixologic.com/zbrush/ esp http://store.pixologic.com/

3D modelling for landscapes http://www.e-onsoftware.com/
3D extras (software etc) for Unity https://www.assetstore.unity3d.com/

Adobe After Effects; I am not sure Curtin has a license for this, but it is great for video editing.

Panorama stitching software eg www.autopano.net or www.easypano.com/virtual-tour-studio.html or any of
www.ptgui.com/
software.bergmark.com/enfuseGUI/Main.html
gardengnomesoftware.com/pano2vr.php
krpano.com/
flashificator.com/

Visiting Fellows to work with me at Curtin University in Visualisation, 2013

I am very happy to announce that two Visiting Fellows and two Early Career Visiting Fellows will work with me in October and November on various projects.

They are (and please note, dates are provisional):

Visiting Fellows

 

Nov 4-27: Dr Jeffrey Jacobson, http://www.publicvr.org

To provide examples of interactive and immersive environments featuring architecture and archaeology of the ancient world, to run inside Curtin’s new visualisation facility: the iDome, Stereo Wall, and/or possibly the Wedge. Upload and run PublicVR 3D models inside UNITY on the iDome: the Virtual Egyptian Temple, Living Forest, and Theater District of Pompeii. Prototype ancient heritage sites to run on the 0.5 CAVE (actually a Wedge). Design and pilot an evaluation environment for potential use in humanities subjects, including history, and the visualisation undergraduate degree.

Nov 16-Dec 16: Dr Rob H. Warren, Canada, http://blog.muninn-project.org
Link 3D models in virtual environments (Unity real-time engine) to archival databases to create a specific pilot of a World War 1 simulation using accurate historic geo-data, weather data, astronomical data, and historical records. Design and pilot an evaluation environment for potential use in humanities subjects, including history, and the visualisation undergraduate degree. Link to colleagues in New Zealand and Canada to discuss potential research collaborations.

Early Career Visiting Fellows

Nov 4-11: Andrew Dekker, University of Queensland http://itee.uq.edu.au/~dekker/ OR http://uq.academia.edu/AndrewDekker

We will work together on the following project: Camera tracking and biofeedback for indirect interaction with virtual environments. This project will connect biofeedback devices and camera tracking devices with equipment in the Curtin Data Visualisation Facility (CDVF) and provide a research platform to evaluate how biofeedback can be a meaningful interaction component for virtual environments, especially for augmenting socially believable agents, and to enrich the apparent “life” and “atmosphere” of digitally created architectural environments.

Nov 18-25: Dr Hafizur Rahman, Bangladesh http://bdheritage.info and http://ttclc.net

Create a streamlined 3D model data and virtual environment workflow, analyse and compare different image-based modelling tools, and explain their optimal deployment for community web portals of digitised cultural heritage.

Acquiring 3D models of artifacts is always expensive, as it typically requires a 3D laser scanner and relevant training. However, 3D models of small artifacts can be produced from photographs using low-cost software such as 3D Som Pro (http://www.3dsom.com/). This software can produce 3D wire meshes and baked images for rendering, which can later be used as a source for augmented reality applications for interactive public display. The free AR Toolkit / BuildAR can be used here to make this interactive display for museums/heritage institutes and interested community groups who currently lack high-end technological resources and related skills.

We will also compare the above to insight 3D (http://insight3d.sourceforge.net/), which is free and open source. We will produce schematic workflows, incorporating Blender 3D for modeling and we will consider alternatives such as Google SketchUp.

Fascinating biofeedback equipment: the BITalino kit

The low cost (€149/$197 + shipping and taxes) kit of modular blocks includes a swathe of physiological sensors that can be broken out to use individually or linked together and used in whatever combination you’re after. BITalino’s approach is plug and play, to keep things as simple as possible. The sensors in the kit can interface with computing platforms such as Arduino (and derivatives) and Raspberry Pi, says project lead Hugo Silva. BITalino also includes Bluetooth connectivity so can be used in desktop and mobile environments.

“Currently there are several APIs for platforms including Android OS, Java or Python; BITalino is also cloud/web compatible through a software framework based on WebSockets, HTML5 and CSS3,” he tells TechCrunch.

http://techcrunch.com/2013/09/05/bitalino/

Sensors included in the BITalino kit are:

  • an EMG (electromyography) to track muscle activation
  • an EDA (electrodermal activity) sensor to measure skin activity/moisture levels
  • a LUX light sensor to monitor ambient light or (used in conjunction with a light source) to track blood volume pulse data
  • an ECG (electrocardiogram) to track heart rate, monitor stress etc
  • an accelerometer to track limb movements

The board also includes an LED block for visual feedback, a microcontroller unit and a power management block to power the other units.
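Since BITalino has a Python API, the raw ADC samples it streams can be converted to physical units in a few lines. A sketch of the ECG conversion, using the transfer function and constants I believe are given in the BITalino ECG sensor datasheet (10-bit ADC, 3.3 V supply, gain of 1100); verify these against your own board's documentation before trusting the numbers:

```python
def ecg_millivolts(adc_value, n_bits=10, vcc=3.3, gain=1100):
    """Convert a raw BITalino ECG ADC sample to millivolts:
    centre the sample around mid-scale, scale by the supply
    voltage, and undo the analogue front-end gain."""
    volts = ((adc_value / (2 ** n_bits)) - 0.5) * vcc / gain
    return volts * 1000.0
```

A mid-scale sample (512 on a 10-bit ADC) maps to 0 mV, and the full range spans roughly ±1.5 mV, which suits ECG signals.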

YouTube video: