This edition first published 2021.
© 2021 by Bernard Marr
Registered office
John Wiley & Sons Ltd, The Atrium, Southern Gate, Chichester, West Sussex, PO19 8SQ, United Kingdom
For details of our global editorial offices, for customer services and for information about how to apply for permission to reuse the copyright material in this book please see our website at www.wiley.com.
All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, except as permitted by the UK Copyright, Designs and Patents Act 1988, without the prior permission of the publisher.
Wiley publishes in a variety of print and electronic formats and by print-on-demand. Some material included with standard print versions of this book may not be included in e-books or in print-on-demand. If this book refers to media such as a CD or DVD that is not included in the version you purchased, you may download this material at http://booksupport.wiley.com. For more information about Wiley products, visit www.wiley.com.
Designations used by companies to distinguish their products are often claimed as trademarks. All brand names and product names used in this book are trade names, service marks, trademarks or registered trademarks of their respective owners. The publisher is not associated with any product or vendor mentioned in this book.
Limit of Liability/Disclaimer of Warranty: While the publisher and author have used their best efforts in preparing this book, they make no representations or warranties with respect to the accuracy or completeness of the contents of this book and specifically disclaim any implied warranties of merchantability or fitness for a particular purpose. It is sold on the understanding that the publisher is not engaged in rendering professional services and neither the publisher nor the author shall be liable for damages arising herefrom. If professional advice or other expert assistance is required, the services of a competent professional should be sought.
Library of Congress Cataloging-in-Publication Data is Available:
ISBN 9781119695172 (hardback)
ISBN 9781119699385 (ebk)
ISBN 9781119699378 (epub)
Cover Design: Wiley
Cover Image: © Anna_leni/Shutterstock
To my wife Claire, and my children, Sophia, James, and Oliver; and everyone who will use the amazing XR technologies to make our world a better place.
I've always wondered whether other people see the world in the same way as me. And I mean that literally, not figuratively. Do other people see the color green in the same way as I see it, for example? Do I see things exactly the same as everyone else, or am I experiencing something unique to me? After all, what is reality, anyway? Isn't “reality” different for all of us?
I may never know for sure whether I see the color green in exactly the same way as others. But what I can do – what we can all increasingly do – is embrace this notion of a reality that's unique to me. This is possible thanks to extended reality (or XR for short).
XR blurs the boundaries between the real world and the digital world, meaning it can be used to create more personalized, unique experiences. For now, this is mostly used to create immersive experiences in marketing, education, tourism and the like. But in the future, it could extend to all aspects of life as we know it – to the point where each one of us could potentially transform the real world around us into something personalized, using special glasses, headsets, or maybe even contact lenses and implants. Let's say you hate the garish paint job your neighbors have done on the exterior of their home. In the future, your glasses could change it for you, and you'll see whatever color house you choose. Or let's say you see an impressive building and want to know who designed it and when it was built. Your glasses will be able to tell you, overlaying the info directly in front of your eyes (or you'll be able to point your phone camera at the building and see the relevant info onscreen).
Increasingly, our experience of the world will take place in this blurred area between the real world and the digital one. If you think of the time people spend on social media, crafting their online persona, it's clear the line between the digital world and the real one has already become pretty porous. XR will accelerate this. If that sounds a little ominous, it's not. I believe XR is going to change our world and transform our businesses for the better. As the examples in this book show, it's already happening.
To be clear, this isn't a tech book. It's not about how to build XR experiences. It's about real-world applications, and the incredible possibilities of XR, now and in the future. It looks at how XR is already being used in practice, across a range of different industries, and what these state-of-the-art applications might mean for the future. I've therefore written the book with business leaders in mind, but hope that anyone interested in this huge tech trend will find inspiring food for thought in these pages.
As a futurist, it's my job to look ahead, identify transformative tech trends and tell people about those trends as they begin to burst into the mainstream. It's something I've done before with key trends like artificial intelligence (AI) and big data. Given that XR is predicted to become a $209 billion market by 2022,i I'd earmarked it as another burgeoning trend to watch closely.
That is to say, I planned this book before the coronavirus crisis hit, and started writing while under lockdown in the UK. During lockdown, it became even more obvious that XR is a tech trend rapidly on the rise – and that the technology will now be fast-tracked by many companies.
What was already a trend before COVID-19 quickly became a way of life for many, giving businesses a vital way to maintain connections between people, from the comfort (and safety) of their homes. Pretty much overnight, people who had previously gone to work in an office were conducting daily video calls from home (with increasingly impressive virtual backgrounds), and new tools surfaced that simulate the experience of working in an office environment. Argodesign's artificial window concept is just one example. It's an LCD screen that goes on the wall and looks like a window with the shade pulled down – but if you pull up the shade, you see a colleague (or colleagues) through the “window.” You can even chit-chat and make awkward eye contact, just like in a real office.
Virtual conferences are another good example. As traveling to in-person conferences was suddenly no longer an option, virtual conference experiences – like those provided by VirBELA – stepped in to bridge the gap with immersive online conferences, right down to the breakout sessions.
Many experts, myself included, believe coronavirus will change the very nature of work, tipping the balance in favor of more remote working. Which means our lives will become ever more digital, and those digital experiences will need to become even more realistic. Interactions between the real world and the digital world will become all the more seamless. The boundaries between the real and the virtual will further blur.
In the future, then, we'll be able to have our business meetings and team-building sessions in whatever virtual settings we want – around a campfire in the middle of a gorgeous wildlife resort, in a futuristic office, on a beach, or even on the Moon. Why not? XR makes anything possible. And we won't even need to leave our homes to do it. You could prepare for that big presentation in front of a virtual audience before you present it in the real world. And after that big presentation, the team could let off steam by going to a (virtual) Rolling Stones gig, or watching a Manchester United or Dallas Cowboys game from your (virtual) corporate VIP box.
The pandemic also gave us a taste of how XR will alter the customer experience. With businesses suddenly unable to connect with customers in the real world, lockdown presented many of them with a stark choice: adapt or die. Again, XR provided a way to maintain those connections with customers and give them a unique, memorable experience. One great example comes from Barcelona-based bridal company Pronovias Group, which launched a virtual showroom and virtual appointments, allowing customers to shop the latest bridal collections at home. Going forward, XR could deliver many more opportunities to immerse customers in the brands they love, and support the in-person customer experience.
There's another reason this book is so timely: we're entering a new industrial revolution – the fourth industrial revolution – where innovation is being driven, in particular, by AI and big data. These technologies feed into and enhance XR technologies, as do other tech trends like 5G, cloud computing and edge computing (processing data close to where it is generated). This perfect storm of technology will aid the development of new XR solutions and make XR experiences even more powerful in the very near future.
I delve more into the technology itself in Part 1, but, for now, let's take a brief look at what XR means. XR is in fact an umbrella term for a range of immersive technologies, spanning the ones we already have today – virtual reality, augmented reality and mixed reality – plus those that are yet to be created. In terms of the current technology, we have virtual reality (VR), which immerses the user in a fully digital, computer-simulated environment; augmented reality (AR), which overlays digital content such as information, graphics and images onto the real world; and mixed reality (MR), which blends the two, letting users interact with digital objects as if they were real.
Clearly, then, XR represents a spectrum, with some of the technologies being way more advanced and impressive than others. Some require specific hardware, while others harness the capabilities of the average smartphone. The interfaces are constantly evolving, and it's likely we'll experience XR in completely new ways in future. But, across the spectrum, all the different XR technologies have one thing in common: they enhance or extend the reality we experience, whether it's by blending the virtual and real worlds together or by creating a fully immersive digital experience that feels as authentic as the real world.
This ability to create more immersive digital experiences or enhance the experience of the real world around us is going to transform many businesses and industries. It will provide companies with new ways to connect and engage with their customers, and improve the customer journey. It will also bring exciting new opportunities to improve business processes, including training, education and hiring.
In short, XR will turn information into experiences. And this has the potential to change, well, pretty much everything.
In Part 2 of this book, we'll explore real-world use cases from the here and now – compelling examples of how the world's biggest brands are starting to use XR in practice. For example:
Having explored the current state-of-the-art in XR, in Part 3 of this book I'll take a look ahead and see where XR might be heading in the future.
In this chapter, we've learned:
I hope this introduction to XR has whetted your appetite, and you're now keen to learn more about XR's capabilities. In the next chapter, we'll delve into XR technology in more detail.
Without getting too bogged down in technical details – after all, this isn't a tech book – it's worth spending some time exploring the different technologies that sit under the XR umbrella. Therefore, this chapter gives you a basic grounding in the XR spectrum, including how the various XR technologies work, and what they can do.
My goal in this book is to showcase the world of XR, and how XR technologies are changing our lives and our businesses. What I'm not trying to do is rigidly define each type of XR and draw distinct boundaries between the different technologies.
This is important because XR is still very much a developing field, and it's not always clear where one XR technology ends and another begins. For example, experts can get far too caught up in whether something should be classified as augmented reality (AR) or mixed reality (MR). To me, that just isn't useful, nor is it particularly relevant. At least, not from a business perspective. I imagine you, the reader, want to grasp the potential of XR and understand how it can improve certain elements of your business – and you don't much care where the boundary between AR and MR lies. I make the assumption that you're interested in uses, results and outcomes, as opposed to academic debate.
It's also worth noting that, just as the boundary between the real world and the digital world is becoming more blurred, so too are the boundaries between the different XR technologies. As XR advances, the various technologies that sit under the XR umbrella will become more and more linked, and users will be able to seamlessly move from one technology to another.
So, in the future you may use AR to bring information to life in the real world, then switch to VR to deepen that experience. Say, for example, you're taking a (real-life) holiday on a Greek island. Using AR, you could point your phone at some impressive marble columns and the information onscreen will tell you those columns once formed the entrance to a site where mysterious ancient rituals were performed. Flip on some VR goggles and you could then immediately step into this world and move among the people of Ancient Greece – no toga required! In the final chapter of this book, I talk more about the future of XR, but one of the key developments I expect to see is a more seamless blending of XR technologies.
What's more, this technology will evolve in ways we can't yet imagine. Remember the fairly brief but intense craze for all things 3D a few years back? 3D movies like Avatar and Gravity blurred the boundaries between the normal moviegoing experience and something altogether more immersive. Then people started buying 3D TVs for their own homes, expecting the home viewing experience to move in a similarly immersive direction. But the concept didn't really take off as expected, and manufacturers quietly shelved their production of 3D TVs. Now, holographic displays are beginning to emerge that revive this notion of immersive home viewing and take it in a new direction: they can project 3D holograms from the screen, without the viewer having to don clunky glasses (a major downside of the previous 3D wave). This shows us how technology is constantly moving forward, toward a future in which everything in our lives becomes more immersive, more digital – but the specifics of how that technology works, what it's capable of, and even what it's called will change. The same sort of thing may happen within the XR spectrum; for example, it's possible that digital displays will be able to project virtual content onto the real world, without us needing special headsets or apps.
All this means precise definitions will likely become less useful as XR evolves and the boundaries between different technologies become more blurred. That's why we shouldn't get too bogged down in definitions of and differences between concepts like AR, VR and MR. What matters is how we can apply the technology in the real world.
That said, in the interest of breaking up the rest of this chapter into manageable chunks, I'll now attempt to create some loose distinctions between AR, VR and MR. Let's start with AR.
For me, AR has the biggest potential in the short term, because it doesn't have to involve special kit like goggles or headsets. In many cases, a simple smartphone, laptop or tablet – anything with a camera and a digital screen – will do. (That said, there are specially designed AR glasses, like Google Glass, which will crop up in examples throughout this book.)
Whether it's using specially designed glasses or a simple smartphone, AR involves the projection of digital elements – such as information, graphics, animation or images – onto the real world, so that the digital content being superimposed looks like it is part of the physical world. I've already mentioned Pokémon GO as one example of this technology in action; those Snapchat filters that overlay cute animal ears over your own are another basic example. There's also Google's SkyMap app, which tells you about the constellations as you point your smartphone camera at the sky. Or how about the IKEA Place app, which lets you digitally place IKEA's furniture in your room, so you can check out whether it fits (and how it looks in that space) before you buy.
Because the digital element is superimposed onto reality, the user is still very much in touch with the real world in front of them (unlike, say, a VR experience, where the world created around the user is entirely digital). Yet, thanks to the AR projection, the real world has become enhanced – more informative, more entertaining, or more interactive, for instance.
Head-up displays, which project information onto a windshield, are another interesting example of AR in action. The technology was initially developed for fighter jets, so that the pilot could keep looking ahead while accessing relevant info. Now, cars and trucks are beginning to use head-up displays as a safety feature, in order to help reduce driver distraction. These displays project real-time information such as GPS maps or vehicle information either directly onto the windshield itself (in cases where the technology is included in the vehicle as standard) or onto a film that's been added to the windshield (in cases where the technology has been retrofitted). Just as in those fighter jets, the idea is to keep the driver's eyes front and center, giving them the info they need at a glance, without hindering their view of the road ahead.
AR needs a live camera feed in order to add digital content on top of the real-world elements. The camera feed is what allows the AR system to understand the physical world, so that it can add the right digital content in the right place (a puppy nose over your real nose, for instance). This is all possible thanks to computer vision, also known as machine vision – essentially, a subset of artificial intelligence (AI) that helps machines “see” the world around them and respond accordingly.
Once it has the live, real-time camera feed (be it of a building, the street, your friend's face, or whatever), the AR system then renders digital content on top of the relevant real-life content, making sure it overlaps correctly and is located in the right place. This is updated in real time as the camera feed changes – say, as you're walking down the street holding up your phone.
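To make that detect-then-overlay loop a little more concrete, here is a minimal, illustrative sketch in Python using the open-source OpenCV library. It's my own toy example, not taken from any product mentioned in this book: it grabs frames from a webcam, runs a basic face detector, and draws a crude "puppy nose" marker that follows the face as the feed updates in real time.

```python
import cv2

# Load OpenCV's bundled Haar-cascade face detector (a very basic form of computer vision).
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture(0)  # the live camera feed: the "real world" input

while True:
    ok, frame = cap.read()
    if not ok:
        break

    # Step 1: the computer-vision model "sees" where faces are in the current frame.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    # Step 2: render digital content on top of the real-world content,
    # anchored to what the detector found (here, a simple red "puppy nose").
    for (x, y, w, h) in faces:
        nose_center = (x + w // 2, y + int(h * 0.65))
        cv2.circle(frame, nose_center, max(w // 10, 5), (0, 0, 255), -1)

    # Step 3: show the composited frame; the overlay updates as the feed changes.
    cv2.imshow("Toy AR overlay", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to quit
        break

cap.release()
cv2.destroyAllWindows()
```

Commercial AR platforms do far more than this – tracking surfaces, depth and motion so that digital objects stay locked in place as you move – but the basic rhythm of see, understand, overlay, repeat is the same.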
VR offers a far more immersive experience than AR, but, in order to do that, it requires more technology and infrastructure (at the very least, a VR headset). The good news is that this kit is getting lighter, better and less cumbersome. We no longer need heavy headsets with lots of cables that connect to a computer. Now, we can have a lightweight, standalone headset or head-mounted display that doesn't need to be plugged into a main computer. The technology is getting cheaper, too – for just a few dollars, you can get a basic Google Cardboard VR viewer that, along with an accompanying app, transforms your smartphone into a VR device. Of course, for the best VR experience, you currently still need fairly elaborate gear, such as headsets, controllers and speakers. But there's no doubt that the technology is generally shrinking, and getting cheaper and simpler – all of which helps to make VR rapidly more accessible.
While AR is rooted in the real world, VR creates a 3D, 360-degree experience of an artificial, computer-simulated ecosystem. Strap on a VR headset and you're completely transported into this artificial world – whether it's being underwater and exploring a coral reef, walking on the Moon, visiting Ancient Egypt, or whatever. Meanwhile, the real world around you is totally blocked out. Such VR headsets include the Oculus Rift, HTC Vive, GearVR and the previously mentioned Google Cardboard (which is, you guessed it, made of cardboard). These vary in sophistication in terms of how slick and seamless the experience is.
The world of gaming was an early adopter of VR technology, and is perhaps still the first thing people think of when it comes to VR experiences. But, as you'll see in this book, many other industries are now beginning to harness the possibilities of creating fully immersive experiences for customers and colleagues alike.
One recent VR example is the Spatial app. This is a virtual meeting space that lets you meet up with colleagues or friends, whether or not you have a VR headset. If you don't have a headset, you can simply join using a web browser on your phone, tablet or computer. This is an important leap forward because it means people without a special VR kit can still join in the experience. Spatial is also free and open to everyone (a paid-for, enterprise version with enhanced features is also available).
With Spatial, you can meet with others in a beautiful virtual meeting space, and, thanks to virtual avatars – you can take a picture of your face to create your own personalized digital avatar – it feels like you're really in the room together. What's more, your avatar can move around the room and gesticulate as you talk. As you can probably imagine, this is a far cry from the average Zoom or Skype experience, where you're just looking at a wall of 2D faces. Spatial says it has experienced a huge surge in demand – an increase of approximately 1,000 percent – in the wake of COVID-19.i I'm not surprised. Tools like this will revolutionize remote working.
(As an aside, the use of personalized avatars is particularly interesting to me, and something that we're likely to see a lot more of across various XR technologies. In the future, we could all have different avatars for different digital settings. For example, you could have a smartly dressed avatar for your virtual work meetings. You could have a completely different avatar [animal, human, whatever] for gaming and hanging out with friends online. And you could also have a very realistic avatar, one that accurately reflects your real-life size and shape, which you could use to virtually try on clothes before you buy.)
Vision is key to creating an immersive 3D environment, which is why special VR headsets are needed. A VR headset is, in essence, a small screen held close to the eyes (or two screens, one for each eye). Sound effects are also key to creating a consistent, engaging experience, which is where speakers and headphones come into play. Then you have head- and eye-tracking technology to track the user's movements. This may use lasers and infrared LEDs within a headset, or sensors within a mobile phone – or, in very sophisticated systems, special cameras and sensors can be installed in the room to monitor movement.
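To give a flavour of what that head tracking actually feeds into, here is a small illustrative sketch in Python – my own simplified example, using an assumed coordinate convention and an average 64 mm eye separation. It shows how a tracked head position and rotation can be turned into two slightly offset viewpoints, one per eye screen, recomputed every frame as the head moves; rendering a separate image from each viewpoint is what creates the sense of depth.

```python
import numpy as np

IPD = 0.064  # interpupillary distance in metres (~64 mm is a common average)

def eye_positions(head_pos, yaw_radians):
    """Return (left_eye, right_eye) positions for a tracked head pose.

    The eyes sit half an IPD to either side of the head centre, along the
    head's local "right" axis, which rotates as the head turns (yaw).
    """
    # With yaw = 0, "forward" is +z and "right" is +x; yaw rotates about the y axis.
    right_axis = np.array([np.cos(yaw_radians), 0.0, -np.sin(yaw_radians)])
    half_offset = (IPD / 2.0) * right_axis
    return head_pos - half_offset, head_pos + half_offset

# Simulated tracking loop: each new head pose yields two fresh viewpoints,
# and the headset renders one image per eye from those viewpoints.
head_pos = np.array([0.0, 1.7, 0.0])  # a standing-height head, in metres
for frame, yaw_deg in enumerate([0, 5, 10, 15]):
    left, right = eye_positions(head_pos, np.radians(yaw_deg))
    print(f"frame {frame}: left eye at {np.round(left, 3)}, right eye at {np.round(right, 3)}")
```

Real headsets do this dozens of times per second, which is why low latency matters so much: if the images lag behind the head movement, the illusion breaks and users can feel motion sick.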
I've already mentioned how the line between reality and the digital world is becoming increasingly blurred. MR – sometimes referred to as hybrid reality – plays on this notion and takes it to a new level by combining elements from VR and AR. MR is by far the least mature of the three XR technologies featured in this book. However, as we'll discover, companies are already beginning to use MR to solve their business challenges, support new initiatives and improve business processes.
There are lots of confusing definitions surrounding MR and, in particular, some debate over what constitutes MR versus AR. For me, the distinction is this: MR blends components of the digital world with the real world in real time, to the extent that you can interact with the digital elements as if they were real objects. This creates a more immersive experience than straightforward AR. For example, instead of seeing a projection of a digital object on top of the real world (as you would in AR), MR would let you move that digital object with your hands, turn it around to inspect it from different angles, make it bigger or smaller, and so on. With MR, you don't fully block out the real world, as you would in a VR experience. Rather, you're able to experience a virtual environment and the real world at the same time.
One example of MR in action comes from British company BAE Systems, which uses MR to enhance its production of electric bus batteries. Using Microsoft's HoloLens MR headset, BAE workers can project 3D images and instructions onto their workspace, and follow the digital instructions to construct the complex batteries. According to BAE, the use of MR has reduced the time it takes to build batteries by up to 40 percent.ii
MR requires a dedicated MR headset and a lot more processing power than VR or AR. It may also require the use of controllers and motion tracking technology, such as gloves that track your hand movements so you can interact with digital objects.
At the time of writing, the Microsoft HoloLens is the main MR headset on the market, and it comprises holographic lenses, a depth camera, a variety of sensors, plus speakers. With the HoloLens, you look through the headset and see your normal surroundings. But you'll also see holograms (for example, virtual beings, information or objects) overlaid on top of the real world – and, using hand controllers or specific gestures, you can play around with these holograms as if they were real. For instance, you might see a digital to-do list beamed onto your office wall and be able to wipe items off the list as you complete them.
As I've already mentioned, in the future I believe AR, VR and MR will all merge together to create more immersive user experiences, where you can move from one device to the next to deepen the experience, shifting from an experience that's more rooted in the real world to one that's fully digital. This blending of technology will eventually allow us to see the world however we fancy – to turn the real world around us into whatever we want. Pink trees instead of green. A cartoon avatar instead of your boss. A rainforest instead of a bland conference room …
And the technology itself will change. Right now, to get a fully immersive VR experience, you need special gloves or even full body suits to track your movements and simulate the feeling of touch. In the future, everyday cameras will be able to integrate with XR experiences and track our movements. Beyond that, brain–computer interfaces could be used to simulate the feeling of touch, without needing any external technology at all. Then we'll have the integration of smell, and freer movement (thanks to things like omnidirectional treadmills, which let you carry on walking in whatever direction you want).
You can read more about this futuristic vision of an XR-driven world in Chapter 13. For now, the key message is this: although it's obviously helpful to understand what XR technology can do right now, it's vital we remember XR will evolve in ways we can't yet imagine.
In this chapter, we've learned:
I've already set the scene for where I believe XR technology is headed. But what about where it has come from? How did we get to this point, where the line between the real world and the digital one has become so blurry? Turn to the next chapter to trace the evolution of XR.