Rembrandt Reality

Use your smartphone to travel to the year 1632 and step into Rembrandt’s painting ‘The Anatomy Lesson of Dr. Nicolaes Tulp’.

Place the gate and walk round in the Anatomical Theatre.

See through Rembrandt’s eyes how Doctor Tulp and his fellow doctors are examining the body of the criminal Aris Kindt. Discover all the stories behind the painting.

Developer description on the Google Play Store

This amazing ARCore application for Android smartphones has been a favourite of mine since I discovered it over a year ago. It’s great to load it up every once in a while and enjoy the 6DoF environment in all its glory (it has a huge playspace).

Scanning the floor to create a ground plane

Best used outdoors in a quiet place, with good quality audio headphones. I use my Pixel 3a XL, which has a good quality display, and Sennheiser HD 461 headphones, which provide some isolation from background noise. The application can be used with your phone display in portrait or landscape, with auto rotation.

Placing the gate on the ground plane

The application boots up quickly and takes you through an introduction to the scenario and how to interact.

You then scan the floor to create a ground plane for the application to sit on; once ARCore has worked its magic, a “gate” appears, which can be placed precisely by tapping on your screen.
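
The app’s source isn’t public, but this tap-to-place step maps closely onto ARCore’s standard plane-detection and hit-test APIs. A minimal Kotlin sketch of the idea (the function name placeGateAnchor and how it would be wired into a render loop are my own assumptions):

```kotlin
import com.google.ar.core.Anchor
import com.google.ar.core.Frame
import com.google.ar.core.Plane
import com.google.ar.core.TrackingState

/**
 * Tap-to-place helper: given the current ARCore frame and the screen
 * coordinates of a tap, return an Anchor on the first upward-facing plane
 * that was hit, or null if the tap missed the detected ground plane.
 */
fun placeGateAnchor(frame: Frame, tapX: Float, tapY: Float): Anchor? {
    if (frame.camera.trackingState != TrackingState.TRACKING) return null

    for (hit in frame.hitTest(tapX, tapY)) {
        val trackable = hit.trackable
        if (trackable is Plane &&
            trackable.type == Plane.Type.HORIZONTAL_UPWARD_FACING &&
            trackable.isPoseInPolygon(hit.hitPose)
        ) {
            // The anchor keeps the gate pinned to this spot on the floor
            // as ARCore refines its understanding of the world.
            return hit.createAnchor()
        }
    }
    return null
}
```

The returned anchor is what keeps the gate fixed to the same spot on the floor as you move around it.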

Initial appearance of the gate

Once the gate has been placed it materializes first as an archway with a stone finish, before opening the “gateway” into the Rembrandt environment.

An open invitation… what lies beyond?

You are then invited to walk through the gate, and this is where a genuine sense of physical space emerges: you physically walk forward into a rich black backdrop, with the scene itself set further back from the entrance gate.

The anatomy lesson in progress

In front is the anatomy lesson of Dr. Tulp, with the surgeons keenly watching the dissection. I am able to walk forward another 5 metres before reaching the centre; the sense of scale is very impressive as the environment uses 1:1 mapping.

With ARCore providing a solid spatial anchor, I can freely walk around inside the environment and get as close as I want, with high-quality assets showing rich detail in 6DoF.

The sense of presence is rewarding despite this being presented on just a smartphone display rather than through AR glasses.

During the experience, “hotspots” can be tapped to provide very useful insights into the original Rembrandt painting through images and audio description.

Up close and personal

Once you have visited the different hotspots you feel well informed, yet the real treat for me is always looking around the environment, with the ceiling a particular highlight – it is cleverly mentioned in one of the hotspot-activated commentaries.

Amazing ceiling with animated bird flying about

My other favourite element of this AR experience is exploring the boundaries of the environment and looking back through the entrance gate (to the “real world”), which creates a strong impression that you are inside the world of the painting!

The participants appear to be taking note of the gate….

I haven’t measured the available space inside the environment, but it’s very large. I always finish by walking back through the gate.

Because the scene is persistent, the gate straddling the boundary between the environment and the real world can be carefully inspected and walked through in both directions, with the students and doctor remaining in place – a great example of a “portal mechanic” in action.

My mind is blown by seeing the real world back through the gate! It’s fun to stand to one side and look around the thick edge of the gate out into the “real” street outside.
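
The developers haven’t said how their portal is rendered, but a common way to implement this kind of portal mechanic is with the stencil buffer: draw the gate opening into the stencil first, then draw the painted world only where the stencil is set, so the Rembrandt scene is visible only through the gate until you step inside. A rough OpenGL ES sketch in Kotlin, assuming drawGateOpening() and drawVirtualWorld() are placeholders for your own draw calls:

```kotlin
import android.opengl.GLES20

/**
 * Stencil-based portal rendering sketch (per frame, after the camera feed
 * has been drawn). isCameraInsideWorld flips once the user walks through.
 */
fun renderPortalFrame(
    isCameraInsideWorld: Boolean,
    drawGateOpening: () -> Unit,
    drawVirtualWorld: () -> Unit
) {
    if (isCameraInsideWorld) {
        // Inside the portal the virtual world fills the view; showing the
        // camera feed back through the opening would use the inverse test
        // (left out here for brevity).
        drawVirtualWorld()
        return
    }

    GLES20.glClear(GLES20.GL_STENCIL_BUFFER_BIT)
    GLES20.glEnable(GLES20.GL_STENCIL_TEST)

    // Pass 1: write the gate opening into the stencil buffer only.
    GLES20.glStencilFunc(GLES20.GL_ALWAYS, 1, 0xFF)
    GLES20.glStencilOp(GLES20.GL_KEEP, GLES20.GL_KEEP, GLES20.GL_REPLACE)
    GLES20.glColorMask(false, false, false, false)   // don't touch colour
    GLES20.glDepthMask(false)                        // don't touch depth
    drawGateOpening()

    // Pass 2: draw the painted world only where the stencil was set,
    // i.e. only through the gate, leaving the camera feed elsewhere.
    GLES20.glColorMask(true, true, true, true)
    GLES20.glDepthMask(true)
    GLES20.glStencilFunc(GLES20.GL_EQUAL, 1, 0xFF)
    GLES20.glStencilOp(GLES20.GL_KEEP, GLES20.GL_KEEP, GLES20.GL_KEEP)
    drawVirtualWorld()

    GLES20.glDisable(GLES20.GL_STENCIL_TEST)
}
```

Once the camera crosses the gate plane the test is inverted, so the world fills the view and the real street shows only back through the opening, which matches the effect described above.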

It’s always a pleasure to use Rembrandt Reality; the developers did a great job building it with ARCore. High-quality experiences like Rembrandt Reality demonstrate the potential of augmented reality even on smartphones (I’d like AR glasses, but perhaps not until 2030?).

Rembrandt Reality is available as a free download on both the Google Play Store and the Apple App Store (there is an ARKit build for Apple devices).

Thanks for reading! Rob Cole.

Microsoft Hololens

I’ve been fortunate enough to have had several sessions with Microsoft’s Hololens AR standalone headset; it’s always been impressive to use despite the obvious limitations of current AR technology.

Talking of technology, Microsoft list the Hololens with these specifications:

    Optics: see-through holographic lenses (waveguides)
    Holographic resolution: 2 HD 16:9 light engines producing 2.3M total light points
    Holographic density: >2.5k radiants (light points per radian)
    Eye-based rendering
    Automatic pupillary distance calibration

In addition, the Hololens has a fully loaded sensor array:


    1 inertial measurement unit (IMU)
    4 environment understanding cameras
    1 depth camera
    1 2MP photo / HD video camera
    Mixed reality capture
    4 microphones
    1 ambient light sensor

Compute

    Intel 32-bit architecture with TPM 2.0 support
    Custom-built Microsoft Holographic Processing Unit (HPU 1.0)
    64 GB flash
    2 GB RAM

Regarding pricing, I’d only heard of them being sold to enterprise and big business (i.e. Microsoft partners), but I once saw a Hololens for sale in Computer Exchange (CeX) for a cool £3,200.

From my somewhat limited understanding of augmented reality technology, there is a long roadmap of development still ahead.

VR is almost seen as a solved problem, with further iterations only set to improve on what is already a very immersive experience in terms of ‘presence’ (the feeling of being there). Wider field of view, varifocal optics, eye tracking, HDR and other features will be introduced to consumer headsets as costs come down.

VR experiences are very effective even with current consumer level technologies.

But AR has a much harder set of technical challenges and problems to solve before we find ourselves wearing the “AR glasses” seen in films and television shows over many years. Two great examples of AR glasses and contact lenses in media are Hulu’s Mars mission television show “The First” (Sean Penn) and Clive Owen’s recent film “Anon”.

AR overlay in the movie “Anon” with Clive Owen and Amanda Seyfried

Facebook Reality Labs, Apple and Microsoft are amongst the companies employing lots of very smart people to try and figure it out, as the race to replace the smartphone with AR glasses is underway. Of course, Microsoft had its Kinect sensor technology from the gaming console business, which was further developed for Hololens.

Welcome to the future….

Microsoft’s Hololens standalone AR headset has been available in the UK since October 2016, with a new version shipping right now. As a special-order device aimed at enterprise customers, it has been difficult to get any hands-on time with it, until Microsoft held a launch party for their new London experience store.

And of course I went back several times in the following weeks to use it again, including a quiet morning where I had a full hour using the Hololens 😘

Interesting form factor and ergonomics: rotate the headband, push it back, adjust the wheel on the rear of the headband.
Sensors galore and awesome looking waveguides

The device was reasonably light (reported at 579 grammes) and comfortable to wear, with an easy adjustment system: a rotating headband which is pushed back to fit, and a simple adjustment wheel on the rear of the headband to change the circumference.

The holographic display was surprisingly impressive, with the limited field of view not as severe as I had been led to believe. Yes, it was limited, especially compared to my VR headsets, but after all… it was using holograms 🤯

Holographic resolution and brightness were sufficient to create a convincing illusion; it was better than I had expected from reading many reviews before trying it myself.

I first did an experience focused on the current London location, but with an AR overlay showing a historical scene with horses and carriages rolling past outside, which felt really magical.

Then I used several applications which were already on board, with one showing how to use hand gestures; it was here that the limitations of hand tracking were evident, with gestures sometimes needing several attempts to register. Despite that, it was great fun when it worked, offering the freedom of hands-free computing.

However, the lighting conditions were not optimal, with lots of sunlight and people moving about, so it would need testing in another location to determine the reliability of the gesture recognition.

Microsoft list the device capabilities as follows:

Using the following to understand user actions:

    Gaze tracking
    Gesture input
    Voice support

Using the following to understand the environment:

    Spatial sound

Having an amazing time playing with Hololens

Overall I found Hololens to be an impressive demonstration, clearly signalling the huge potential of AR glasses.

Most importantly, it passed the “WOW!!” test, which is the ability of any HMD to make you pull the wow! face. This is clearly seen in the image below, wow!

Having now used Hololens several times, I’m really looking forward to trying its successor, the Hololens 2.

I’m also very interested in following the development of augmented reality glasses as the successor to the smartphone. Google Glass, Microsoft, Magic Leap, Apple, and many more to follow…

Making the transition to a “head up, hands free” computing platform has substantial benefits: better skeletal posture, fewer repetitive strain injuries, increased spatial and environmental awareness, and hands left free to interact with both the computing interface and the real world.

My experiences with the Hololens and Magic Leap have firmly convinced me of AR’s potential to change our world.

However, these two devices remind me of early VR headsets from the 1990s, where the potential was clear to see despite the technology being immature.

I don’t expect to see really competent AR glasses until the early 2030s, but I do look forward to trying further developments as AR technology continues to improve.

Big thanks to the people at Microsoft London for letting me use the Hololens. And thanks to you for reading! Rob Cole

Immersivecomputing @ Raindance Immersive Festival 2019

Following my first visit to Raindance in 2018, I was keen to go again and waited patiently for the organisers to open up their website ticketing for the 2019 event.

It was tricky working out which experiences to visit during each timeslot so I tried different combinations until I had a full day of experiences booked for the final Sunday.

The venue was the same, the Oxo Tower in London; it felt comfortable to be back at a familiar venue as I walked up the stairs.

After signing in I received an armful of coloured wristbands and a programme of events with a floor layout showing where each of the experiences were located. The event was starting to fill up with guests eager to get into VR.

HTC Vive Pro Eye

I started off with “Gloomy Eyes” on the HTC Vive Pro Eye headset, my first time seeing one or using one since HTC launched the new model.

With the eye tracking inoperative it was the same Vive Pro I’m familiar with: generally a decent headset, with a robustness ideal for public or enterprise use, where headsets don’t always get well treated!

The experience itself was very…gloomy…but very cinematic and awe inspiring. Hopefully it will get a home release very soon!

I was getting some light leakage at certain angles due to the large windows facing the River Thames, so tried to find a better pose in which to enjoy the experience. I was glad to be using an OLED display headset as it was very dark inside Gloomy Eyes.

Eye tracking module inside Vive Pro Eye

After Gloomy Eyes, I tried more experiences, all using the Vive Pro Eye… they were everywhere at the festival, alongside some older headsets.

Doctor Who: The Edge of Time was fun, though I managed to get snagged on some virtual scenery, which prevented me from completing the demo.

No Man’s Sky VR was my next experience; I failed to get out of the crater and grew irritated as I grappled with the Vive wands. 🥴

My final experience of my first session was “The Curious Case of the Stolen Pets” on the new Oculus Rift S, a headset I hadn’t been able to demo anywhere in London.

Oculus Rift S in all its glory!

This was a great opportunity to see how different it was from the Rift CV1 which I’d owned in the past; it reportedly uses a single display panel (overclocked) and the new hybrid Fresnel lenses from the Oculus Go.

The fixed IPD and lower refresh rate (80 Hz) had concerned me, though my IPD of 63.5 mm was right in the sweet spot, and the lenses proved Oculus’s prowess in lens design (which often requires a lot of ray simulation on a supercomputer).

Pet rescue on the Rift S

I was initially impressed with the Rift S: the display was clear with minimal screen-door effect, and the lenses were free of artifacts. The experience was very fun, with a large puzzle I spun around as I tried to rescue the pets; I solved the first two before running out of time.

However, after months of using the Valve Index, the lower 80 Hz refresh rate of the Rift S was very noticeable and felt sluggish in VR.

Great lenses in the Rift S

It didn’t give me a great feeling of presence, partly due to the lower refresh rate and partly the smaller FOV, which felt like scuba goggles again after the Index and actually seemed slightly smaller than the CV1’s.

Touch insight (left), Touch CV1 (right)

The Touch insight motion controllers worked well, but felt a bit creaky and unbalanced compared to the sublime Touch CV1 controllers which perhaps shall remain the gold standard.

Oculus Touch insight controller

Overall the Rift S felt like a sidegrade, with steps both forward and backward. It is certainly very good value at £399, although the refresh rate remains a concern – 90 Hz should be the minimum for PC VR, with the Index proving 120 Hz to be the new standard.

As my first session finished I felt very satisfied having tried the Vive Pro Eye and Rift S, but these were quickly forgotten when it dawned on me that my next session used something even rarer, which I hadn’t expected to be available to demo.

What could that be??

The “Rise of the Animals with David Attenborough” immersive AR experience on the Magic Leap awaited; of course I needed to have a really good look at the hardware 🤯

Sensor array on Magic Leap
The ML goggles
See the waveguides?
Compute unit on belt
Remote

After looking very thoroughly at the entire kit, it was time for my first Magic Leap AR experience.

The lady running the experience helped me fit the Magic Leap on my head; I hung the compute unit on my shoulder but didn’t need the remote, as hand tracking was being used.

Making gestures using hand tracking to start the experience

The experience was… very impressive. I’d read so much negativity and so many poor reviews about Magic Leap that I wasn’t sure how well it would work or how effective it would be.

I’m not going to spoil the experience itself; the field of view was limited (as many reviews had mentioned) but serviceable, and the two depth focus planes were very welcome after several years of fixed-focus VR headsets.

I had dinosaurs crawling about everywhere in a huge space so I went wandering about following the creatures…much to the amusement of the other guests!

Overall I was impressed with Magic Leap which gave a really good impression of how powerful AR can be once the technology develops.

It certainly got me moving about and using the entire space, and the image quality was good and effective. Environmental tracking and hand tracking also impressed. I had used Hololens in the past, most recently only a week before Raindance, and thought the Magic Leap was a superior device in many ways.

After Magic Leap it was time for lunch, before another round of experiences. I spent my lunch talking XR with developers, volunteers and people from different companies and universities, including some people from Bose who were showing their new AR sound glasses.

These were interesting to try and quite effective; I tried a couple of different audio experiences. The glasses provided good audio presence but were somewhat overpowered by background noise.

After Bose, I had some fun using the Oculus Go which has always been good for shorter sessions as it’s a bit front heavy like the Quest.

Many Oculus Go charging alongside a Vive Focus

Oculus Go experiences included Anonymous, Playing God and Afterlife. The last of these was pretty creepy, showing the aftermath of a child’s death and its impact on the remaining family members, specifically the mother.

The Go is still a great device for media consumption and 3DoF experiences, especially when paired with good audio headphones. It’s ideal for festivals, being inexpensive and easy to deploy in volume.

Probably the strangest thing about going to immersive festivals is lifting off a headset to realise you are in a room full of people wearing headsets! It’s a bit odd to witness, and I had the thought that many headset owners would appreciate the opportunity to try the experiences and demos at home.

I spoke to the organisers and proposed an idea for the next festival; offer ticketed access so headset owners could participate online during the festival with time limited access to the different experiences.

Perhaps the last time I’ll see Oculus’s Rift CV1 at a festival? This solitary headset did its usual stellar duty!

I thoroughly enjoyed my time at Raindance Immersive Festival 2019, huge thanks to the organisers, the developers and hard working volunteers assisting the guests.

Thanks for reading! Rob Cole, immersivecomputing