Magic Leap Live | Ep. 001: Designing for Spatial Computing



Hello, creators! Welcome to the very first episode of Magic Leap Live. This is the live stream that we are putting on for creators, by creators. I am your host, Alan Noon. I'm a member of the Learning Resources team here at Magic Leap, and our mission in life is to provide you, the developers, with nuts-and-bolts, meat-and-potatoes type information that you can take, turn around, and use in your projects right away. We're going to do this through a few different avenues. We have a content schedule where every Wednesday of the month we'll be delivering new information and new content. This will come in the form of video tutorials and written tutorials. We'll have sample projects that you can download and put on your hardware once you have hardware, and we'll also release the source code for these projects so you can dissect them, tear them apart, and use bits and pieces in your own applications. As a former developer myself, it's really important to me that we continue to deliver this kind of useful information. We're not really going to be doing any press-release-y fluff pieces; this is content for creators. And in that vein, today we have a very special guest, Mr. Brian Schwab, from Magic Leap's very own Interaction Lab. Thank you. So, Mr.
Schwab, what are we going to be talking about today?

Well, I want to talk about a couple of things about design in this space. I want to talk about some nuts-and-bolts, low-level design and input stuff, some specific spatial computing challenges, and then some more advanced topics.

All right, sounds awesome. Before we do that, let's talk about the format of the show going forward. Every month we're going to start out with some news: again, creator news. This will be things like the latest SDK drop, or, if we're out there on the road doing presentations, where you'll be able to come out and interact with us. We'll do a community spotlight: we are out there on the social media channels, looking at Twitter and our Facebook page, and we're also looking through our forums for cool projects that you are creating out in the world. We've got a few of those today, so we're going to take a look at them and discuss. And once user groups spin up, we'll be able to highlight some of the big meetings happening across the country, and eventually the world. Today, as mentioned, we have our special feature: Designing for Spatial Computing. When we don't have a special guest on, what we're eventually going to do is develop projects live on the show: actually create projects, deploy to the hardware, and see things up and running. At some point I'd like to do what I call the Dueling Banjos type of situation, where we'll have a Unity person and an Unreal person developing these projects in tandem; I think it'll be fun to see how the two develop side by side. That's for later. Today, after our special feature, we're going to do some Q&A, so jump into the chat stream on the Twitch channel and preface your question with [question], and our moderators will pick it up and present it to us later so we can go through them all. Every month we have our
announcement on the forum about the upcoming show, so you can get in ahead of time and post your questions there. And finally, let's set expectations a little for the show. As mentioned, we have a special guest, so we're going to be talking about some design theory and design practice, and believe me, we are working as hard as we can to make sure we can show the device on the stream at some point in the near future. For today, just hang tight, and we'll get there as soon as possible.

Indeed, indeed. A lot of people have been working really hard for a long time, and we would love nothing more than to share what we've been up to.

All right, with that, let's roll into news. Recently (well, I suppose last month counts as recently) the Creator Portal launched. You can go to the Creator Portal, sign up for an account, and download the SDK. Through the SDK we provide the Package Manager, which is the piece of software that gives you access to all of the different components you're going to need to get started with Magic Leap. We provide links to Unity and Unreal with the Magic Leap integration, we have some debugging tools, all that good stuff. You can also check out the SDK examples. SDK Examples is another group within the company that provides real core implementations of features, so you can get a head start by looking at how things are implemented. You can also start working with the simulator a little bit now. The simulator is really good at its prime mission in life: providing the ability to hook your own engine up to the Magic Leap platform and actually test that things are up and running. And as always, check out our documentation: loads and loads of documentation that is always expanding and ever-growing. The next news item today, speaking of SDK drops, as I mentioned earlier: the 0.13 drop landed today. I'm going to read these off; normally I memorize every single new feature and bug fix in an SDK drop, but as it dropped
this morning, I'm just going to have to read it off the prompter here. We have new background music player support at the C API level. We also have in-app purchase support, which is super exciting. And there are some fixes to ML Remote, a.k.a. "the simulator" as we like to say: gesture key points are more reliably rendered in the eye view. I guess you're going to talk a little bit about gestures today, but when you perform a gesture, Magic Leap provides key points that the developer can access and use, and those are now rendering in a better fashion, we think. There were some graphic artifacts, flickering on old video cards; that's better now. And there are some fixes to the image tracker, so images should be showing up more reliably in the virtual room of the simulator. All right, those are our news items for today, so we can roll on to the community spotlight.

Very good. It is super exciting to see our fledgling community jumping in, downloading the SDK, and starting to work with the simulator. We have a couple of projects to highlight today. The first one we're going to look at is by CG Geeks Creative. Here you can see the simulator with our virtual room set up, and what looks to be a motion-captured character dancing about the room. You can see the live preview in Unity on one side, and it looks like an archer dancing in front of the virtual TV. Pretty cool stuff. The next project is by Mr.
Jeremiah Bower. This one's pretty cool; I like it for a number of reasons. Right off the bat, you saw there was a green cube that the user walked into in order to start spawning these meteors: pretty cool interaction already starting to happen. And we're noticing a theme amongst some of the early adopters here; this is not the only project we've seen where there are flaming objects falling from the sky.

An interesting correlation between pyromaniacs and early adopters, is that what we're realizing? Yes, I think so. What's interesting to me is that the simulator is mostly made for people to do back-end integration, and it's been kind of amazing to see that people are already diving in and trying to do some interactions, trying to do some stuff, even though the performance isn't quite at the point where that level of iteration is easy yet. It's awesome to see the enthusiasm of people going for it.

Yeah, absolutely. The other reason I wanted to pick out this particular sample: as we always say, Magic Leap is an additive display, so things that are really bright and shiny look fantastic. That's part of why we're seeing a lot of fiery objects right out of the gate. But the other thing I noticed is a couple of details that I suspect, once Mr.
Bower gets his hands on hardware, he may want to tweak and change a little bit. I'm seeing a couple of best-practice-y type implementations here that he's going to want to adjust, so it will be interesting to see how these projects evolve over time.

Yeah, it's been fun to watch new creators and developers come in, get their hands on the hardware, and really start getting their minds wrapped around the new medium we're actually creating here. It's been fun to watch people go from a screen in front of them to, you can think of it as, that screen laid down on the table. "Desktop" tends to be one of the first apps a lot of people make: something that's just on the table, which has the advantage that the user is still sitting still and interacting with it. Then they get to the point where they start experimenting with head pose, so they'll usually keep stuff on the table, but now you move around the table, that kind of thing. Then it becomes a full-room experience, and the next thing you know it's all over the house.

It's funny you say that, because that happens to be the exact path I took with some of my first projects. Being a former game developer myself, I imagined, oh, this is perfect for tabletop, and that's exactly what it was. Over time that project evolved, and now it's more of a widespread experience. I think our community is going to come upon some of those same lessons, much in the same way we did ourselves. And speaking of those lessons, I guess we can roll right into our special feature with you, Mr.
Schwab: Designing for Spatial Computing. I guess the best way to start is: tell us a little bit about yourself. What is it you say you do here?

So, my name is Brian. I come from the video game world; I had a long career there, and I came to Magic Leap and started up a team here called the Interaction Lab. We are a large group of mostly engineers (we have some designers, we have some artists), but for the most part we're a rapid prototyping group. We actually sit in a systems engineering department. We do quick turnarounds on any given feature that comes in, and give feedback, metrics, and requirements back into the program. At the same time, in order to give that feedback, we do a lot of experience exploration, to make sure that any requirements we put back into the program have a very solid footing in user experience and in feel; the reason we give these particular numbers to the engineering teams is because it feels good at that point. The kind of amazing thing is, yes, a lot of us are from games, but it's not because this is a game company. It's more that we're trying to do a lot of 3D object manipulation, a lot of 3D scene work, and that kind of thing, and in games people have done that a bit more than in, say, web dev or those other worlds. One of the other big tasks my team has taken on over the years is documenting all of this research, along with internal teaching. Over the last four years we've probably made six or seven hundred prototypes, so there's a lot of documentation and a lot of spreading that knowledge around. Now, as we ramp up towards creator launch, what we're going to start doing is package up some of the best little bits of interaction into full source code for Unreal and Unity, and put that out as a product, as a set of
lessons that come out periodically, called Magic Kit. That's where my team will be pushing towards launch, with all of this built-up knowledge that we have.

Some of the lessons that your team has come across, assembled, and packaged up have been tremendously valuable for us in Learning Resources. Again, Learning Resources is a fairly young group within the company, and we were able to get up to speed very quickly thanks to the efforts of the Interaction Lab and Magic Kit, so I know we're all super excited to get that out into the world as soon as possible.

So, I wanted to dive in a bit. The interesting thing about this particular medium is that it's very new from a design standpoint. It's not just a lot of new inputs and a lot of new pieces of tech to think about; it's actually a very, very different design pattern, in terms of how you make experiences and how you interact with the user. Usually, when I'm sitting in front of a laptop with a screen in front of me, or sitting on my couch with a giant TV in front of me, or I'm in a VR experience with a helmet on, the trick as a designer is to make your world so beautiful, so rich, so engaging that it drags the user into that world. It pulls them into your vision and makes them insert themselves into the creator's reality. When you're using a spatial computer and building mixed reality experiences, what you're doing is kind of the exact opposite: you're trying to make digital objects that are compelling in the world alongside the user; you're trying to bring the content out here. You want people to stay in reality, to stay in their world, and you want to add some additional magical elements to that world. I've used this metaphor a few times, and it's actually worked fairly well: it's one thing, both as a
creator and as a consumer, to go to a movie, sit down, and watch George Clooney play a character in a well-scripted world that the Coen brothers, or whoever, have created. You have all of that going on, and it's an immersive experience. It's quite another to be sitting in my kitchen and have George Clooney walk in and have a little bit of an experience with him. Even if it is also scripted, it's still in my room, in my world, as opposed to me being in that world. That all by itself creates a tremendous gap between what most of us are used to when we're making experiences and where we have to go if we truly want to mix pixels into our world. It's also very, very different for the user. So it's not only hard to make; you also have to learn the design rules to ease your players and your users into these experiences. That's why I wanted to give kind of a deep design talk today, because there's as much to learn in the design world as there is in the tech world, and in some cases I'd say a little bit more, because this is such a new space. I want to preface this by saying that when I say "best practices," I'm not going to tell you "please make sure your font is 12 points" or "please make sure you use blue number 307." I'm going to give you best practices along the lines of "these things are fairly magical," "watch out for this, watch out for that," and so on. It's more like good areas and bad areas, or areas that need special handling.

Certainly we've been developing for Magic Leap, as we are the creators of Magic Leap One ourselves, but I think it's safe to say we are not yet the experts, and the creator community out there is really going to help define what a great experience is. Sure, we can provide guidance at this point, but I
think it's a journey we're all going to take together.

Yeah, we find new stuff, new good stuff and new bad stuff, almost on a daily basis. It's been an amazing learning experience working here. So, I wanted to talk first about some very specific, almost-always-magical things. I have four: simplicity, spatialized audio, physicality, and multi-user.

Simplicity. To me, the number one piece of design magic you can keep at the forefront of your head is that the fewer pixels you use, the more magical each one of those pixels feels. Again, you want to keep people as much as possible in the real world, and if you use too many pixels it starts feeling like a game or a CGI scene. Whereas if you use just enough pixels, it feels like they've magically been inserted into the real world. Many, many times developers will come in, and they'll have a character sitting in a big pond, with a waterfall going over the side of the table, and there are palm trees, and there are things flying by, and occasionally something runs past. Then a week will go by, and they'll come back, and the palm trees will be gone. Then the waterfall will disappear, and a week after that they'll say, "you know what, we actually found it's best if it's just the little character running around on my keyboard," as opposed to all of the accoutrements of the scene. It's actually much more magical when this tiny little thing is still in reality, as opposed to living in a little version of its own reality. That has again and again proven to be compelling.

I'd say there's probably even one step before that. Learning Resources is part of the developer relations group, so some of us go out there, travel, and speak to our early partners, and you know,
some of the first questions I like to ask are things like, "what type of experience are you looking to create?" I wouldn't say without fail, but a good majority of them will say, "oh, we're going to fill the world with content, and you're going to look over here and there's going to be this thing, and a giant creature..." They're talking about filling the whole space, and that seems to be the very first step. And I think, okay, they're going to explore this and discover it, and then it gets down into that sort of scale and scope you were talking about, and they pare away, and pare away, and pare away.

It's always beautiful, just like that concept. Even in video games, for many years, I can't tell you how many times I've said the phrase "keep it simple, stupid," and it's become even more distilled in this medium. Suddenly, keeping it simple makes everything so much more magical. The other reason simplicity makes a difference in this world is that users only have so much short-term memory to keep new things in their heads. Any good experience maker knows this: if you're going to make a new experience, only teach a few things, because you want users to focus on the content, not on all of the things they need to learn in order to experience that content. And we're already asking people to do so many new things. In the past, I knew by looking at the screen where to look for content. I knew where the boundaries of play were. I knew what I was supposed to put my hands on for input, and I'm very accustomed to a mouse and keyboard, or a controller in my hand. Now we're asking people to give up those simple boundaries. Potentially, content can come from anywhere; potentially, I might have to do a hand gesture; potentially, I might have to do all these other things. And until
people get used to that, it all by itself is going to overwhelm them a bit. If the initial experiences you make are fairly simple, you're probably better off, simply because users don't have a lot of extra bandwidth at this point to absorb a bunch of stuff to learn before they can just be in an experience.

Okay, the next thing: spatialized audio. Spatialized audio has proven itself over and over again to do a couple of things. If I have a digital object sitting in front of me, and it is nicely audio'd up with the right spatialized goodness, it feels way more real than if it isn't, especially because most of us haven't actually experienced much spatialized audio in our lives. If you go to IMAX and they have a crazy 7.2 surround system, you've felt that, but sitting in your room, I don't think a lot of people have a sound system that can really do that kind of thing. The point is, it really roots things in reality, in a way you wouldn't suspect. It's been beautiful to watch people interact with it. Your brain only actually gets about 40 percent of what you think you're looking at from your eyes; the rest is sort of Photoshop of the mind. Your mind takes as many little cues as it can that something is real, and that audio hook is another massive signal to your brain that this thing is real. Because of that cue, your brain will Photoshop the object into the real world that much more firmly. Now, the other beautiful thing about spatialized audio is that you can use it to direct the user's attention. Like I said, there's no longer this boundary of pixels on a screen; technically I can have pixels coming from anywhere, and users might have to be told where to look for that content, and a lot of times people use spatialized audio cues to do that. The one best practice
that we would suggest, or sort of thing to think about (we found this in our research; believe it or not, I didn't know it ahead of time), is that people have really good eyes to the front, and they rely on them very heavily. If I have an object here and it has a spatialized audio cue, yes, that feels very real. But if I close my eyes and you put a spatialized audio source somewhere I can't see, some people will point in the right direction, and some people will point in the exact opposite direction, because their ears perceive it as an echo or some sort of anomaly. We really rely on our eyes to validate our ears for spatialized sources to the front. Behind us, however, is where we have our sort of eyes-in-the-back-of-the-head human sense; it turns out that, just for survival's sake, we're actually really, really good at localizing sounds that are behind us. So let's say I've got a digital object right here, and I also have one over there, outside of what I can currently see, and I want to direct the user's head toward it. Instead of putting the sound right at the target (because that's still roughly in front of me), if I put the sound slightly behind me on that side, I will almost always turn my head in the right direction. As the head comes around, you can then spatialize the audio where it actually needs to be, but at that point you've directed the user's attention in a much more reliable way. That's actually pretty awesome, because, like I said, directing user attention is something people in VR have had challenges with, and we definitely have as well. We want people to be out there in the real world, not confined to some little playground; we want the whole world to be your playground. So that sort of user focus and attention is super important.
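The "park the cue slightly behind the listener" trick can be sketched engine-agnostically. What follows is a rough Python sketch of the idea, not Magic Leap API code; the function name, the 1.5-meter radius, and the 135-degree rear offset are all illustrative assumptions, not numbers from the talk:

```python
import math

def attention_cue_position(head_pos, head_forward, target_pos, radius=1.5):
    """Place an audio cue slightly *behind* the listener, on the same side
    as an off-view target, exploiting our strong rear sound localization.
    Works in the horizontal plane: positions are (x, z) pairs in meters."""
    # Signed angle from the head's forward direction to the target.
    target_angle = math.atan2(target_pos[0] - head_pos[0],
                              target_pos[1] - head_pos[1])
    forward_angle = math.atan2(head_forward[0], head_forward[1])
    rel = (target_angle - forward_angle + math.pi) % (2 * math.pi) - math.pi
    side = 1.0 if rel >= 0 else -1.0
    # Park the cue roughly 135 degrees toward the rear on the target's side,
    # rather than at the target itself (which may still read as "in front").
    cue_angle = forward_angle + side * math.radians(135.0)
    return (head_pos[0] + radius * math.sin(cue_angle),
            head_pos[1] + radius * math.cos(cue_angle))
```

In a real Unity or Unreal project you would drive a spatialized audio source's transform with the equivalent vector math, then slide the cue toward the true target position as the user's head starts turning.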
Rolling back to making content believable through spatialized audio: I think back to when I came to interview here. I went through the demo loop, and there was a particular piece of content, a sort of mechanical character, flying around the room. It was incredibly engaging, very fascinating, but as it came closer you could hear all the little machinery and gears ticking away and clicking, and when it was right there in front of your face, it really felt like it was there. I believed it; I was completely sold. It was amazing.

It's interesting, too. You talked about fire earlier, and I think one of the reasons fire is so compelling on this system, like you were saying, because of additive light, is that fire of its own accord is self-illuminated and a little bit ghosty, so it looks exactly like real fire on our device. Because of that, it's very compelling; there are almost no cues saying that it's not real. The interesting thing about that particular character is that he's not only got little whirring, clicking mechanical bits; there's also a hot element that's bright red and heated. It looks kind of like a heat sink pouring heat off of it, and there was a little bit of air distortion that looked that way too. Because of that, we had a number of people who almost felt warmth.

Oh, I did, yeah.

Again, your brain wants to believe. It takes little cues, and when it hears that machinery and sees that heat-signature sort of cue, well, it was always surprising to me how many people felt an actual little sensation of heat, just because their body had assumed they were going to feel it.

Yeah, it almost sounds hard to believe, but yes: I held out my finger, and the
creature came over, and as it reached the tip of my finger I felt an actual tingle, and I pulled back. You were just anticipating it. Amazing.

So the next thing I want to talk about is physicality, and how important that is. If you introduce an object into the real world, having it follow real-world rules is one of the main ways to really glue it to the world. That means physics: if I let go of it and gravity takes it and it hits the ground, or if I throw it and it follows gravity, that matters, because people carry a fairly good physics model in their heads, and if objects act in accordance with it, they feel like they're in the real world. If an object bounces off real-world objects, or goes behind things, again, it feels like it's in the real world. The interesting thing is, when you start a normal screen-based experience, a lot of times the early design push is to teach the rules of the world. In mixed reality, what we're often doing instead is teaching users which of the real-world rules we're still respecting. Right, so: yes, gravity is a thing, occlusion is a thing, this is a thing, that's a thing. That's the analogy there, and physicality is a big part of it. The other piece is that if I have an object and I'm going to manipulate it with whatever device I'm using, having it act like a normal physical object again roots it in the real world. If I have a controller and I grab the object and can move it in a one-to-one fashion, that feels way more like a real object than if I grab it and pull it towards me and scale it in very computery ways. That, again, feels a little bit video-gamey; it doesn't feel quite so real, and it doesn't keep me rooted. And then, lastly, the other thing I want to talk about in this magical portion is multi-user.
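Before moving on, the one-to-one manipulation and let-it-fall behavior described above can be sketched in a few lines. This is illustrative Python for the pattern, not any engine's actual API; the class and function names here are mine:

```python
GRAVITY = -9.8  # m/s^2, standard Earth gravity

class GrabbedObject:
    """One-to-one manipulation: while held, the object preserves the offset
    it had from the controller at grab time, so it moves exactly as a real,
    rigidly held object would (no lerping, no scaling tricks)."""

    def __init__(self, position):
        self.position = list(position)   # [x, y, z] in meters
        self._offset = None              # None means "not held"

    def grab(self, controller_pos):
        self._offset = [p - c for p, c in zip(self.position, controller_pos)]

    def update(self, controller_pos):
        if self._offset is not None:
            self.position = [c + o for c, o in zip(controller_pos, self._offset)]

    def release(self):
        self._offset = None

def fall_step(position, velocity_y, dt, floor_y=0.0):
    """After release, integrate gravity so the object lands on the floor,
    matching the physics model users carry around in their heads."""
    velocity_y += GRAVITY * dt
    position[1] = max(floor_y, position[1] + velocity_y * dt)
    if position[1] == floor_y:
        velocity_y = 0.0
    return position, velocity_y
```

The point of the sketch is the contrast: position follows the controller exactly while held, and real gravity takes over on release, instead of any "computery" smoothing or scaling.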
Every time we've done a multi-user experience, it has proven to be very much more compelling than not, and there are a couple of reasons for that that we've found. The first is that if I see a fantastical object in my world, in reality, like if this thing ran out from behind the couch right now, the first thing I would do, subconsciously, is go "holy crap, am I crazy?" and quickly check to make sure I'm not. I would look to another person, look at your face, and if you were just staring off into space, I'd go, "okay, I'm not going to mention anything, because I'm crazy." Whereas if Alan says "holy hell!" then I know I'm not crazy, and we can merrily go along interacting with this thing that I didn't invent myself. Humans look to each other for validation of fantastical things, and given that we now let you put fantastical things into the world, having other humans around to validate them makes everything much more real, much more connected. And lastly, most people haven't actually done a lot of sharing of digital stuff in the real world. Even the super-pedestrian versions we have right now are fun. I can bring up a Google Doc right now and have five people in there typing, and all by itself that's kind of fun; we can almost make a game out of a word processor that way. We've had experiences here in the office where four or five people were sharing pixels in a room, and it's always been super compelling; everybody enjoys it at a much deeper level. So I always push people to take on the extra overhead of the networking and that kind of thing in order to get into that space.

Okay, so the next group of things I want to talk about are some very specific spatial computing challenges. I want to talk a little bit about FOV issues, about cognitive load, about what I call "screen mode," and then,
finally, about fallback inputs. So, a lot of people, especially the VR people, tend to say, "oh, well, you guys have a finite FOV; is that a huge problem, is that a massive restriction?" And we have found, with some fairly easy-to-follow design rules, that it's not that big of a problem. Yes, if you do something basic and not thought out, you can notice the finite FOV: you can hit the boundary a lot, the content can get clipped, and it can feel odd. But with just the tiniest bit of forethought you can design things so that the restriction is almost magically erased, which has been wonderful to watch over time. A few things we have found work really well. If an object has high spatial frequency in its texture, meaning it has a lot of visual complexity, then, effectively, your brain has a lot of information coming into it, and one of the things that triggers attention is a large change in a chunk of that information all at once. One of the ways camouflage works is by breaking up the space with high-spatial-frequency noise all over it, so that as something moves behind a tree, the eye can't really track it, because the amount of detail changing at any given moment is smaller than it would otherwise be. The same thing happens when a texture hits the FOV boundary: if there's high spatial frequency on that texture, the clipping is less noticeable than when a large, flat, white object suddenly gets cut off; that is a lot more noticeable. Likewise, if you have something that is round and it gets clipped by the boundary and suddenly becomes flat, that's a bit more noticeable than if you have an angular object that
suddenly flattens in a different way.

That's super interesting. Some of these reactions, these psychological reactions we have, are, as you've mentioned to me in past conversations, the result of evolution. So what is it about a round object clipping at the edge of the FOV that's more offensive than a flat edge?

Well, we actually have a number of different edge detectors in our brains, effectively, and round is a different edge detector than straight. You're engaging a different contour-detection algorithm when that clipping happens, so it lights up a slightly different part of your brain, which it doesn't do if the object is angular to begin with. It's been interesting to me to start learning some of these humanistic rules at a deeper level, because the device itself is attached to you; it's so much more humanistic as well, a holistic thing. Now, the interesting thing is, some people try to deal with FOV boundaries by just adding a strip of high-spatial-frequency noise around the FOV directly; that's sometimes called vignetting. What it does is make anything that gets close to the boundary fade out gradually. That does work, but it tends to make your FOV that much smaller. What we have found instead is this: think about the particular object the user is focused on. It doesn't have to be the whole scene; it can be just the thing they're actually looking at and interacting with directly. If that object fits approximately within the FOV, and it doesn't move around so fast that they can't comfortably keep up by moving their head, dragging the FOV around with it, then, as long as that focus object is about the right size and moving at a nice, comfortable speed, they don't actually ever notice the issue.
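The "about the right size" rule above can even be checked numerically, by comparing the visual angle the focus object subtends against the headset's horizontal FOV. A small Python sketch; note that the 40-degree FOV default and the 80 percent comfort margin are placeholder assumptions of mine, not published device specs:

```python
import math

def angular_size_deg(object_width_m, distance_m):
    """Visual angle (in degrees) subtended by an object of a given width
    seen at a given distance."""
    return math.degrees(2.0 * math.atan2(object_width_m / 2.0, distance_m))

def fits_comfortably(object_width_m, distance_m, fov_deg=40.0, margin=0.8):
    """True if the focus object occupies at most `margin` of the horizontal
    FOV, leaving headroom so the user rarely clips it against the boundary.
    Both fov_deg and margin are illustrative defaults, not device specs."""
    return angular_size_deg(object_width_m, distance_m) <= fov_deg * margin
```

The same formula can be inverted at design time to find the minimum comfortable viewing distance for an object of a given size.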
And we've had ten- to twelve-minute-long interactions where people are like, wow, you guys did it, you solved it; and we're like, no, we just did some good design, and people don't even feel it anymore. The next thing I wanted to talk about is cognitive load. When I'm sitting at my desk, I have my hand on my mouse and keyboard and my eyes pointed directly at my laptop screen. I'm getting all of my visuals from right there, and my hands are in this beautiful spot where I can be really quick and do whatever I need to do. Or if I'm sitting on my couch watching my huge television, I've got the entire rest of the world tuned out and my hands wrapped around seventeen axes of control, a killer joystick or joypad or whatever. For all intents and purposes I am completely jacked into that experience: the experience can consume me as a resource and assume that my entire attention is on it. But in the real world, if you want people walking around, if you want people to still actually be a part of the real world, you can't consume them as a resource. You have to leave them enough bandwidth to still operate in the real world. To go back to my George Clooney thing: if I go to the movie theater, I expect you to do basically nothing except watch me perform the movie; but if I'm in your kitchen having an experience with you, then when the phone rings, George had better shut up, otherwise he's not a very good houseguest. We end up having to be good houseguests when we're making these types of experiences, because you still want the person to be in the real world. You want them to be able to walk around without high clutter on the world, and if their dog walks in, you want to take that into account so they don't accidentally step on their dog.
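The "leave them enough bandwidth" idea can be pictured as a budget. A hedged sketch, with invented placeholder costs rather than measured numbers:

```python
# Hypothetical attention costs per interaction on a 0-100 scale
# (invented placeholder numbers, not measured values).
INTERACTION_COST = {
    "ambient_audio_cue": 5,
    "glance_target": 10,
    "point_and_click": 20,
    "two_hand_gesture": 35,
    "twitch_aiming": 50,
}

def remaining_bandwidth(active_interactions, real_world_reserve=40):
    """Return attention left over after reserving some for the real world.

    Raises ValueError if the design overdraws the user's attention,
    the situation where someone stops dead at the staircase.
    """
    spent = sum(INTERACTION_COST[name] for name in active_interactions)
    left = 100 - real_world_reserve - spent
    if left < 0:
        raise ValueError(f"over budget by {-left}: user has no headroom left")
    return left
```

A casual experience would keep `real_world_reserve` high; a skill-based twitch experience deliberately spends more of it.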
You basically need to leave the user enough horsepower, enough CPU if you will, to still operate in the real world. We're actually doing a lot of work to codify how much of that CPU particular interactions take, so that developers can budget how much they want left over based on the type of experience they're making (is it a casual experience, or more of a skill-based twitch experience?) and take that into account in their design. You've probably already seen some of this happen: you'll see a user walking down the street typing on their cell phone, and suddenly they'll stop dead because they've gotten to a staircase. Their brain has said, you don't have enough juice to step up the stairs and type at the same time, so they just stop. That's exactly what will happen in your mixed reality experiences if you don't give people enough juice to keep doing what they need to do in the real world. And that leads me into my next topic, which is screen mode. This is a super important thing we've kind of stumbled upon, which is that if you do have a finite FOV in front of you, and you put too many pixels on it and fill it completely, suddenly your brain will pick up on it and say: oh, I'm looking at a screen, a fancy screen, instead of looking at magical pixels in the world. We've been staring at screens our entire lives; we've trained our whole lives to identify screens. I can identify a TV at an oblique angle from fifty feet away, and if I'm interested in what I think is on that TV, it tells me to do a couple of things. It tells me to center up on the TV,
it tells me to sit still (TVs are basically static; they don't roll around, they're usually not robotic), and it tells me I'm going to get everything I need by just sitting there. We've found that if you put people into screen mode, too many pixels even for a short period, they will tend to stop moving. They'll say: oh, I'm looking at a screen, time to post up. And that's bad. If you want them out in the real world, moving around, having them suddenly go into that mode where they think, oh, time to stop, is a bad thing; it takes them out of reality. I'm sure a lot of you have walked into a room where somebody is deeply engrossed in a movie: they don't even really see the rest of the world anymore, they're into the screen. So unless that's your goal, be cognizant of that, and don't do it. We've found it actually takes a few minutes to get people back. I was going to ask: have you tried to quantify how long, once screen mode engages? Yeah, it depends, but that is typical; it does take a bit. We've even done experiments in VR: if you put an artificial television in a VR scene, it tends to engage the same reactions. People stop moving in the VR world and post up on it. I even saw a talk at GDC this year where they called it MR paralysis or something. It's been funny to watch people have experiences where they're moving around fine, and then all of a
sudden they'll just stop, and you're like: oh, an explosion must have gone off and filled the field of view for a short period. So keep that in mind if you're building stuff. And finally, fallbacks. When I'm in a sandbox game, say, and I have a number of objects in front of me and I want to select one of them, I can potentially just cast a single ray from the end of the character's finger into the scene, collide with whatever, and figure out what it is I want to select. But in the real world you run into a bunch of problems. I could have some bad meshing here because the surface is too shiny and black; maybe there's a hole in the mesh that hasn't quite gotten filled in yet. Or maybe there's a mirrored surface reflecting some of the IR, which in some cases causes a bit of phantom geometry that I collide with instead of the object. And lastly, meshing isn't a hundred percent real time: my cat might have been sitting here for a while and gotten meshed into the scene, and even though he's jumped away, it might take a second or so before that geometry gets cleared and things are back to normal. In the meantime I'm trying to select this thing, and that one raycast doesn't quite get it. In reality we want to use a lot more of the input goodness we have now that we have a wearable device, to overcome all the crazy variables the real world is going to throw at us. There's just too much variance in the real world: things that happen because of light variation, things that happen because of all sorts of issues. Because of that, having a number of fallbacks is a common practice in robotics. If I'm making a robot that's going to Mars or something, I tend to have a
series of fallback inputs that can handle the same problem, to make sure the system is robust and gives me what I need. And likewise, I now have all of this extra information. In the past, if I went up to an NPC and he said, select one of these, you would push forward on the mouse or the joystick, and that single vector is literally all you would get: shoot the ray and figure it out. But if I'm a person and I ask another person to select one of these objects, he's going to do a number of things. He's going to look at it, point his head at it, probably point his finger at it; he might even gesture with his head, like, that thing right there; and then finally he's going to say, "that orange dog." All of those pieces of information happen all the time; people's natural reaction is to over-give as far as information goes. And we now have the ability, using our device, to pick up on all of that. I can use head pose to determine that I'm gesturing with my head, or that I'm oriented over that way. I have both gesture-based input and the Totem controller in my hand to determine where I'm pointing. I have the eyes, and I have voice control I can use for the disambiguation of the target. The other thing we've been working on pretty heavily is blending those together, so I can have a single targeting system that lets me specify targets I want to be more skill-based versus more, let's say, psychic. If I really want something to be skill-based, I should probably blend in a little more of the hand, because that's the primary skill-aiming mechanism a lot of people are used to, as opposed to if I have, say, user interface up on the wall or
something where I might just want to look at it and hit a button on the control. I don't have to point at it at all; I can just look at it, and that feels very magical. And it's the same exact targeting system, the same exact blending system; the targets have just been blended differently based on the specific task and the type of feel I want the experience to have. So some of these solutions that you've been working on in the lab and have assembled and put together, these are the types of things that are going to be shared with the world through Magic Kit? Yes sir. Because this is a very holistic device (we're actually attaching it to your head, and there's a bunch of sensors working in pretty close harmony to give you all of these inputs), we have found over and over again that interactions in spatial computing tend to be fairly holistic solutions. You tend to use eyes and head and controller or voice, all of these things together: first because, like I said, you want to make the system more robust by including all of that information, and secondly because you're getting all that information naturally anyway; that's just how people act. Some people might not do one of those things, and some might not do another, but by collecting all of them you cover a large swath of humanity's natural inclinations. So even somewhat simplistic interactions tend to be a little beefier in the real world. What we're doing is packaging up good low-level interactions that have a lot of that meat to them, with source code, so you can bump your interactions up from lower fidelity to much more robust, workable solutions right off the bat.
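A rough sketch of blended targeting in that spirit; the ray sources, weights, and Gaussian-style scoring are assumptions for illustration, not the Interaction Lab's actual system:

```python
import math

def ray_score(origin, direction, target, sharpness=8.0):
    """Score in [0, 1] for how directly a unit-length ray points at a target.

    Uses the angle between the ray and the origin-to-target vector;
    `sharpness` controls how quickly the score falls off with angle.
    """
    to_target = [t - o for t, o in zip(target, origin)]
    norm = math.sqrt(sum(c * c for c in to_target)) or 1.0
    to_target = [c / norm for c in to_target]
    cos_angle = sum(a * b for a, b in zip(direction, to_target))
    angle = math.acos(max(-1.0, min(1.0, cos_angle)))
    return math.exp(-sharpness * angle * angle)

def blended_target(targets, rays, weights):
    """Pick the target with the best weighted score across input rays.

    rays:    {"head": (origin, direction), "eye": ..., "controller": ...}
    weights: e.g. {"head": 0.2, "eye": 0.3, "controller": 0.5}
    """
    def score(target):
        return sum(weights[name] * ray_score(o, d, target)
                   for name, (o, d) in rays.items())
    return max(targets, key=score)
```

Shifting weight toward the controller ray gives the skill-based feel; weighting the eye ray heavily gives the "psychic" look-and-click feel described for wall UI.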
The last thing I wanted to talk about today is some of the more advanced stuff we're working on, which we're very excited about because it gets into the really fun territory. I just want to talk at a high level about environments, characters, and story. We're actually doing some work right now to provide a couple of different environmental tools that devs can use. You can think of environments in terms of levels of semantic understanding. First, this is a meshed object of some form, just an object. Next, it's a flat object. Above that, it's a table; above that, a wooden table; above that, it's your desk. There are so many levels of context that you can slowly pull out of the real world if you can start to understand environments more and more, and we're working on tools to drive that context as high as we possibly can. Above and beyond that, we also have what we call affordances, and room-solving sorts of situations. If I do identify the chair I'm sitting on, affordances are how you use that chair. If I have an artificial entity, let's say, that wants to use this chair, I need to know approximately where the character would need to sit, and also which direction is forward, so that he can sit on the chair the right way and not end up backwards with his legs up in the air. Likewise for couches and other objects; where are the doorknobs, those sorts of things. Affordances become something you want information on if you're going to use environments in a higher-context way. With characters, we have a huge opportunity to respond to all of these inputs I was talking about before.
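A hedged sketch of what such a chair affordance might look like as data; the field names and the helper are invented for illustration, not a real Magic Leap structure:

```python
import math
from dataclasses import dataclass

@dataclass
class SitAffordance:
    """How a character could use a recognized chair (illustrative fields).

    seat:    world-space point where the hips should land, as (x, y, z).
    forward: unit vector pointing out from the chair's front, so the
             character faces the right way instead of sitting backwards.
    """
    seat: tuple
    forward: tuple

    def sit_yaw_deg(self):
        """Yaw the character should face while seated.

        Measured in the ground plane (x, z), clockwise from +z when
        viewed from above.
        """
        fx, _, fz = self.forward
        return math.degrees(math.atan2(fx, fz))
```

The same shape generalizes to other affordances: a couch might carry several seat points, a door a knob position and a swing direction.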
Head pose, for example: if you're an artificial character asking me questions and I lean in like this, it might mean I can't hear you; if I lean in like this, it probably means I'm interested in what you're saying, like, oh, really? If you ask me a question and I look up and to the left, psychology tells us I'm probably trying to remember; if I look up and to the right, I'm probably trying to invent something. It can even be as simple as the character asking me a question, me looking off into the distance, and the character saying: hey, eyes are over here, don't look at that thing; what's so important about that chair? If a character in your room started talking to you that way, you would be much more compellingly connected to it. The other thing to think about is that these characters, or entities, are in your real world, and because of that you're in a slightly more vulnerable place. If George Clooney comes over, is he going to be judging my couch? Is he going to tell me it's pretty lame? But the wonderful thing about that vulnerability is that it provides a greater framework for actually bonding with the creature, or with George Clooney, or the creature George Clooney; I hope George isn't watching this. So we have that awesome opportunity to make deeper experiences with characters, because they can start to talk directly about the things you're doing with them and the things they see, if they have that environmental context I was talking about. And lastly, story. I think a lot of people worry that we're not going to be able to tell stories in this crazy mixed-up world where I can't build a set, I can't have control over the environment, and I
can't know exactly what it is you're going to do. But we have dinner theater, where people run around and do really awesome stories right now. We have stage magicians out on the street who interact with people, get in your face, and tell a story right there. We have clear-cut examples of this type of storytelling, and many others, already happening in the real world; you just have to adjust your way of thinking. You might have a character that's up on something, and he's going to run around the room and do things with you, and the story might be fairly simple: he wants to get reunited with something, and the game involves him getting there. In some cases he might be able to jump down, and in some cases he might ask for your help. That's this interesting notion that the character could plan a particular story beat out. He could say, well, I need to get over there in order to do the next part of the story, run a planning algorithm, and conclude: I can't get down right now, but the one thing I could do is use the human to get me down. We have these beautiful opportunities now where the user can be involved directly with the character, with the pixels, with the artificial entity's solution to the problem; and that's a pretty cool thing we haven't really had the ability to do before in exactly the same way. Yeah, I see it almost as a spectrum. On one end we have the old-fashioned linear cinematic experience, and for years in the games industry we said, well, now stories can be this new thing where the player has control; but we still borrowed a lot
from that old linear experience. Now, with this new platform, it's a whole different scene, a whole different ball of wax: you don't have set control over the environment, and that's good. I think AI is going to play a lot more into it, so that our content is aware of the environment and aware of what the user is doing; there's a lot of experimentation here. We've actually talked a lot about the fact that the uncanny valley in the past was mostly about visuals: this thing doesn't look exactly real, and therefore it looks kind of creepy, and the uncanny valley kicks in. Whereas in mixed reality experiences, what engages the uncanny valley is more that the reactions are not right, that the awareness of the real world doesn't quite match up with what you'd expect, as opposed to the visuals. That is starting to become the new trigger for the uncanny valley, and like you said, it's usually an AI system and contextual awareness that end up helping you out there. Finally, there are a lot of things I didn't cover. We have a long way to go and a lot more to teach; just making things feel like they're real in the real world takes a lot of work. We are an additive-light system, so adding shadows to things is actually kind of impossible. Well, not impossible, there are ways to do it: I can shine a spotlight on something, so now I have light here, and then simply not put light where the shadow is supposed to be. There's all kinds of super cool stuff like that. So don't think I've given you even a large majority of what you need to know, and please keep looking at this space, because we have a lot more to give. All right, fantastic. We're definitely going to have to have you back on to talk about some of the other stuff we didn't cover today, but we are now going to go into the question
and answer section here. Our moderators have been moderating the Twitch chat; oh goodness, we have a lot. Okay, let's just roll here. First question: will these (meaning the livestreams, I presume) be recorded and posted to YouTube? Yes, absolutely. We are here live on Twitch, and we will be archiving these on YouTube; you'll be able to find them, and we'll add all the keywords and relevant links, all that good stuff. I actually see a question about my glasses. I have an old-man cable on my glasses because I take them off all day long to put our device on, so I'm constantly swapping back and forth; that's why it's there. I don't have some sort of weird future glasses. Actually, that's an excellent topic, and I'm going to guess the related question is down the list somewhere, so I'll preempt it: yes, you currently have to take your glasses off to use the device. Rony has gone on record in the past saying that there are going to be prescription lenses for the device. That is currently a work in progress, and we have no date to announce right now; but never fear, our glasses-wearing friends. All right, next: I have a project that I think would be awesome to feature on the stream; how do I get in contact with you about it? Again, our moderating crew, our social media crew, is out there looking at the Facebook page, Twitter, and our forums. The forums are probably the best way: make a post, show us what you're working on, give a link to an animated GIF or a video or whatever you've got, and hashtag it with "made for Magic Leap" and we'll be sure to pick it up. We'll be compiling these and picking out a handful every month to look at. If you are really interested in having your project featured, then when we reach out to you, we want to make
sure that you're cool with being on the show, so be sure to respond and give us the thumbs up. Okay, next: I see that you support Unreal and Unity; why don't you support some of the other engines and platforms? Unreal and Unity are the starting engines we support. We look to expand that over time, and we certainly offer the hooks for you to go ahead and integrate your own engine, which you can test with the simulator as well. First steps; we're still growing, and we'll get there. We actually already have a couple of external partners that are fully integrating their own engines directly with our C APIs. Okay, next: do you have a recommendation for how best to set up the simulator bindings for the controller? So we have a user here working with the simulator. No specific recommendations, but I will tell you that there has been a recent bug fix; I encountered this myself, where I was unable to fire off one of the trigger events. We are actively working on the simulator and the input bindings, and there are new fixes in the latest drop, so if you were having issues before, hopefully that's remedied; let us know. All right, you kind of hit upon this next question when you were talking about spatialized audio. The question is: have you found any best practices for indicating to users that there is something they can interact with that is currently out of view? You brought up spatialized audio, where you can have a sound fire off, but what other sorts of techniques can be used? We have found a ton. What's awesome is that by being able to put pixels wherever you want, and being able to just-in-time those pixels, you don't have to take up quote-unquote screen space; you don't have to do all the things that you would in a normal game, let's say. You can do things on demand, which has been
pretty powerful. You can telegraph that something is over there simply by, if you do have a vignette, coloring the vignette a little bit in that direction: there's something over there. What also works really well is a sort of mini-map: if I look at an area that I've designated as the mini-map spot, let's say over there, I see a small model of the same room that I'm in, which shows me where I am in relation to the other digital content. I can also just recall items. And, because you want to route people as much as possible, you could have it draw a yellow line on the ground between where the thing is and where you are, so it leads you there based on where you are looking. And lastly, if you never lose the person in the first place, that tends to be pretty good: moving things at about the right speed, or giving them a little trail of magic (that's why they're flying in the first place), gives you just enough to catch people back up if you lost them momentarily. So it's very application-specific, and I think there's actually a huge number of really good ways of doing that. Fantastic. All right, we actually should have addressed this one early on, so let's roll back to the top of the list: what is this spatial computing stuff you all keep talking about? Is this like computing volumes or something? What do we mean, exactly? So as a company, what we're talking about when we say spatial computing is just that our device itself is a spatial computer. It is a computer that knows where it is in space; it has sensors that can pull information from the space around it; and it can build up an understanding of the world around it. That is what a spatial computer can do, and our device is a spatial computer, and so spatial computing
is what a spatial computer does. The larger ecosystem has used several different terms, you know, mixed reality, augmented reality, and those are the types of experiences you can build if you have a spatial computer. That's the relation between those terms. All right, very good. A couple more questions: eventually, could we have a basic multiplayer example project to build on? It seems like anything where multiple people can experience the same thing will be much more compelling. Yes, it's true, and that is definitely something we will do. The one thing I would say is that, as anybody who has made a multiplayer game knows, there's a bit of infrastructure that kind of sucks: am I going to have my friends list, am I going to be able to connect with them, that sort of thing. If it's very simple, just server-client, that can obviously happen pretty quickly, and we can put that out there, but that doesn't represent the state of the art as far as multiplayer environments go. A basic multiplayer example of some form we will definitely do, but I don't know if it will be a triple-A sort of situation, because we're very specifically trying to make all of the Magic Kit stuff simple, so that it's super easy to pick up and super easy to add to. We are definitely going to go there, because yes, that is a very magical and compelling part of the experiences we have found. Speaking for the Learning Resources group, I think this is something we can also tackle. Actually, this might be a good time to delineate the difference between Learning Resources and the Interaction Lab. The way I say it is that Learning Resources offers the fastest path to success, like,
what are the three things I need to do to get this feature working? In the case of this multiplayer example, we could have a tutorial or a sample that just does the basic server-client thing, hooks it up, and you're both looking at the same piece of content. From there, I see the Interaction Lab as best practices: Learning Resources is "how do I make it work," and the lab is "okay, now that I've got it working, what's the best way to use it." Fair to say, do you think? Sure, though I think the other thing we're trying to do, instead of just showing the basic way, is to inspire people in a number of different directions. Like I said, "best" is a weird word for me, because we keep finding better stuff. We try to find as many truly spatialized ways of doing things as we can, and we try to think outside the box as much as possible; but I hear you. Okay, the most important question; we definitely have to get to this. This one is for Brian: would you rather fight one horse-sized duck or one hundred duck-sized horses? I'm going to go with the big duck. I've fought fifty small horses before, and they can kick like you wouldn't believe, so a hundred is, I think, just going to be too much. That big duck is mostly neck: I'm going to be able to get out of its way (it waddles), get my arm around it, and then it's over. Seems reasonable; I'm pretty good at melee. Okay, very good. I guess that's enough Q&A for now, so we're going to wrap it up here. Brian, thank you very much for coming on the show, and thank you to the Interaction Lab for everything you've been working on for the past couple of years. We'll be looking for all that good magic content; we really appreciate it. And a shout-out to the early adopters: again, thank you very much to all of you out there
in the community developing your early projects. Really good stuff; keep it coming. Again, hashtag it with "made for Magic Leap" and we'll pick it up and reach out so we can feature it on the stream. What else do we have? The forum, yes: hit up our forum. The announcement for the next stream will be coming up shortly, so go ahead and jump into that post and ask your questions ahead of time so we can address them. And, as always, the social media channels. I think that's it: first Wednesday of every month, and the next show is June 6 at 2 o'clock. Join us on Twitch, thank you very much for joining us, and let us know how we did today.

21 Comments

  1. Stephen Brule said:

    Useful content from Brian, which starts 10:00 minutes in

    June 29, 2019
  2. J Smith said:

    I stopped at 43 minutes realizing enough time is wasted already.

    June 29, 2019
  3. luvinlyfe4 said:

    [Is there facial recognition?]

    June 29, 2019
  4. Dragon seven said:

    get rid of Mike pstakis and get a real electronics engineer

    June 29, 2019
  5. Judith Beaumont said:

    How did you get all those pink books?

    June 29, 2019
  6. markymark2036 said:

    So, for the interface and interaction, can the device detect hands to allow for poking and grabbing the virtual objects? Or is the totem the primary device to allow touching the objects?

    June 29, 2019
  7. Samson Sliteye said:

    mr schwab? you guys work together and address each other with your last names? 0_o

    June 29, 2019
  8. Ludwig Ederle said:

    Are you only organizing or also reading your books by color?

    June 29, 2019
  9. Skyler Enola said:

    So it's a talk show?

    June 29, 2019
  10. Jorge Rodiles said:

    I hope you sow real magic this time and not just jellyfish floating around…. besides your processor what is difference between you and meta2 or ms Hololens? META 2 does a very good job sensing hands in real-time… and you have only one control like daydream ….. I don’t understand WHY magic leap have raised so much money to be founded and you only deliver nothing but smoke and mirrors … actually more smoke than anything but let’s see what happens on this episode…..

    June 29, 2019
  11. Film Focus said:

    Here is a really good tutorial for beginners who would like to get started creating content for Magic Leap. It's 20 minutes long, but it basically covers everything, step-by step, that you would need to learn. https://youtu.be/RkUUXo4Aufw

    June 29, 2019
  12. superjaykramer said:

    What a crock of shit.. This has got to be the most retarded channel!! WHERE ARE THE FUCKING GLASSES>>>>?

    June 29, 2019
  13. Scott Ashton said:

    Echo blah blah – where the F is the hardware

    June 29, 2019
  14. get lost said:

    Blah blah blah… too much talking, only talking, nothing more than words… what kind bullshit is magic leap anyways?

    June 29, 2019
  15. S X said:

    It's funny that MagicLeap have virtual elephants when the REAL elephant in the room is where the actual headset is….

    June 29, 2019
  16. Watts Designs said:

    Is he wearing concept glasses?

    June 29, 2019
  17. Oculus Rift said:

    The lava dropping from ceiling looks really bad graphics, VR is so much better already lol.

    June 29, 2019
  18. Jason Hunter said:

    high correlation between pyromaniacs and early adopters;)

    June 29, 2019
  19. Imagine Image said:

    Your a tech company not a circus talk show..

    June 29, 2019
  20. TerraBlast2012 said:

    Except all those weird gimmick, everything is already existed in Microsoft mixed reality, now 8k Pimax already out, and soon 8k MR will out soon. Hopefully Magic Leap survived.

    June 29, 2019
  21. Vanishing Sun said:

    The medical applications will be fantastic! Being able to walkthrough a scan of a patients heart or brain will better our diagnostic ability! Get this done!!!!!!

    June 29, 2019
