036 - Emotions in VR, AR and more with Tom Emrich from Super Ventures

Show notes

This week on ResearchVR we dig deep into how emotions can be measured and incorporated into VR, AR, and other digital technologies. We discuss it with a guest who is well versed in digital consumer technologies: Tom Emrich from Super Ventures.

Topics

Here is a list of the most prominent issues we've talked about:

  • world building in VR vs AR
  • basic emotions
  • tracking emotions
  • going beyond labs
  • affective computing
  • technology accessibility
  • technology acceptance
  • wellbeing optimization
  • using emotions as content in themselves
  • metrics of emotions
  • use cases of emotion metrics
  • privacy and security discussion
  • consumer hardware now
  • search by emotion

Links

How to subscribe and contact us

For more details, tune in to another episode of ResearchVR! You can subscribe to our RSS feed in mp3 or AAC format. We hope you learned something new and enjoyed listening.

Thanks to Tom for sharing his expertise with us and our listeners.

Topic suggestions, guest invites, and all other feedback are helpful! Feel free to contact us at researchvrcast@gmail.com or via Twitter:

Rate us on iTunes and help spread the podcast that way!

Show transcript

00:00:00: Music.

00:00:13: For, like, the state of humans, or what you'd want to be in terms of physical and digital form, I think.

00:00:21: Like, the reason why I got into all this is because I love that wearable technology is stripping away the layers of humanity to help us understand who we are as humans and what is real and what isn't real.

00:00:31: So I think with wearables right now we're getting an understanding of,

00:00:35: who we are physiologically and biologically. The next step is emotions, unlocking who we are from an emotional perspective, and then the third phase is from a mental perspective, so that's like,

00:00:47: wearables as quantified self phase one, then emotion and affective computing, and then BCI. And where I really want to see it go is to figure out,

00:00:57: what is a soul. I want to find a soul, and I want to talk to it. All right, you're putting out the technologies that are going to be important, but how does that manifest itself?

00:01:07: What is, like, a day in the life in 30 years, with having BCI, with having emotions,

00:01:13: and how is it different from today? I think we just have better awareness of what we are and what our world is, and,

00:01:22: I believe that especially with virtual reality it's really going to play with our minds from a what-is-real perspective, which reality is real, kind of like this whole talk around simulations. I could see us 50 years from now really understanding,

00:01:35: what this body is and who we are inside this body and how this body interacts with this world versus the other worlds that we enter,

00:01:44: and so I just feel like in 50 years the people that we will be are just much more aware, much more enlightened, much more,

00:01:52: cognizant of the tools that we have, and then able to unlock new ways to communicate, new ways to connect, new ways to experience new realities.

00:02:04: That's kind of the journey that we're on, especially with VR. I think it's funny because we always tend to think,

00:02:12: that we're going to have separate worlds to go into, and it's true to some extent, but we don't consider, like,

00:02:17: Instagram and Snapchat to be different worlds, right? They're all part of the same world, and they're kind of like lenses on top of it.

00:02:25: I don't think we'll be having that metaverse that we're all really expecting to have in VR, but,

00:02:31: I expect it to be linked to reality in a very location-based sense, even if the interactions that you have in there are not worldly at all, or the things you see are not very Earth-like.

00:02:45: It's funny, we always try to get away from this world.

00:02:50: It's more fun to actually use this world and then add your own layers to it, right? Which is where augmented reality comes into play, right? The,

00:02:59: augmented reality future is a hyper-personalized,

00:03:03: version of this world, whereas with virtual reality the dream is that if I wanted to be a centaur tomorrow, I can be a centaur tomorrow, because you wouldn't want to be, you know, me. So for people dropping into this conversation three minutes in,

00:03:16: hello and welcome back to the ResearchVR podcast. Tom Emrich is back from Super Ventures and I am your host, Azad Balabanian.

00:03:24: We were just having a cool conversation and I just clicked record, and I just wanted people to drop in and see what the hell we're talking about.

00:03:33: And speaking of which, today we're talking about emotions, and not just our own emotions, not just, you know, giving you guys our deepest, darkest,

00:03:42: secrets that we've ever had, but more about emotion tech and what you can actually do with understanding your emotions,

00:03:49: at any given time, understanding trends, and how that translates into, you know, the current,

00:03:56: the current industries that exist in tech and whatnot.

00:04:02: I guess we can start with the basics. There are a few basic emotions, and I should have pulled up my research from college, but there are, like, five basic emotions.

00:04:15: There's happiness, there's sadness, fear, disgust,

00:04:19: and surprise. There are actually a few more that are usually categorized, like some people say seven to nine actual basic emotions, but,

00:04:27: it usually boils down to five, and that's why,

00:04:29: Pixar, when they were making Inside Out, chose those five, so that each could be a character. These five emotions are actually,

00:04:41: omnipresent among all mammals actually, and beyond,

00:04:46: to the point that it's actually a thing that connects us as beings of Earth, these emotions. And those emotions,

00:04:54: it turns out they actually guide your memories, they guide your memory creation, your memory recall, how you store those memories. So there's a lot of interesting,

00:05:04: boxes that emotions open up, and up until now, I'd say, you know, we were just scratching the surface, but we haven't had a real,

00:05:12: fast way of tracking any given user's emotions.

00:05:16: And today we'll just be talking about, you know, where we are with emotion tech now, where we see it going, and how we're going to get there. Right, I think up until now,

00:05:26: emotional measurement has mostly been in academia and labs. If advertising and marketing companies were to do this, they would have to hire,

00:05:34: and have folks come into a room to do an elaborate focus group,

00:05:39: so what we're really seeing is more the democratization of these technologies, so that they're starting to creep outside of the lab and start to be out in the real world, which would allow for companies to really get a sense as to,

00:05:52: who their users are in their natural habitat, which is really, really impactful.

00:05:56: So a company like Affectiva, which uses an optical sensor, like a camera, to do facial recognition, they actually have the ability to recognize seven emotions, a little beyond the five that you talked about. So,

00:06:10: in terms of that seven, it's anger, sadness, disgust, joy, surprise, fear, and contempt.

00:06:15: And they're also able to measure things like valence and engagement, and so,

00:06:21: the real reason why this is all possible is really because of the growing ubiquity of sensors, which comes from mobile, right? It's always mobile, mobile ate up the world,

00:06:33: exactly, right. So on our phone we have,

00:06:36: we have the accelerometer, which is motion, but in this case, especially with Affectiva, the camera is really where a lot of that data can be gathered.

00:06:46: So what is Affectiva? I'm looking at their website right now, "What if technology could adapt to human emotion?" is their landing page,

00:06:54: so basically the group comes from the MIT Media Lab and they use computer vision,

00:07:00: leveraging optical sensors to detect facial expressions, and then from those facial expressions they can determine specific emotional states,

00:07:09: and they can also determine, they say, 15 facial expressions, as well as gender and whether or not that person has eyeglasses,

00:07:17: and they've analyzed over 4.8 million faces to date utilizing their platform, across 75 countries, because, I mean, you know, emotions are universal,

00:07:29: if you narrow them down to the basic levels, but there are still obviously variations in terms of how people express those emotions per culture, so there's,

00:07:36: this variation, and it's good to have the full span of the Earth, it seems like.

00:07:41: Right, and what's really interesting here with Affectiva is that they are a great example of how,

00:07:49: you need to have a wide data set in your database to be able to teach your algorithms in order to have better machine learning,

00:07:59: which then allows for higher accuracy in the emotional measurement,

00:08:05: so with affective computing, or emotion tech, the ingredients are sensors,

00:08:10: that are either ambient, that are just out there, like a camera for example, or that are worn, like a wearable with a GSR sensor or the ability for HRV measurement,

00:08:19: and so that's the data. The data is collected, that's the first foundational layer, and then the data needs to be ingested and sent to a platform. That platform needs to have deep learning, so the use of algorithms, machine learning, to be able to make sense of it,

00:08:33: and so the more that you feed that machine learning platform, the higher the accuracy in your results.
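[A minimal sketch of the pipeline described above: sensor samples are collected, aggregated into features, and passed to a trained model that maps them to an emotion label. The sensor fields, feature choices, and classifier interface are illustrative assumptions, not any vendor's actual API.]

```python
from dataclasses import dataclass
from statistics import mean
from typing import List

EMOTION_LABELS = ["joy", "sadness", "anger", "fear", "disgust", "surprise", "contempt"]

@dataclass
class SensorReading:
    """One hypothetical sample from an ambient or worn sensor."""
    gsr_microsiemens: float  # skin conductance (GSR)
    hrv_ms: float            # heart-rate variability
    skin_temp_c: float

def extract_features(window: List[SensorReading]) -> List[float]:
    """Collapse a time window of raw samples into simple aggregate features."""
    return [
        mean(r.gsr_microsiemens for r in window),
        mean(r.hrv_ms for r in window),
        mean(r.skin_temp_c for r in window),
    ]

def classify_emotion(window: List[SensorReading], model) -> str:
    """The 'platform' step: a model trained on a large labelled data set maps
    features to an emotion label; more (and more diverse) training data is what
    drives the accuracy Tom mentions."""
    scores = model.predict_proba([extract_features(window)])[0]
    return EMOTION_LABELS[max(range(len(scores)), key=lambda i: scores[i])]
```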

00:08:45: And so when you're taking a look at companies that are doing emotional measurement, you really want to have a better understanding of the scale of faces, or the scale of data points, they have been able to,

00:08:59: analyze over the course of time. That's really what's going to allow their platform to win out in this type of race,

00:09:06: and we have seen a couple of players here utilize different sensor points. So Affectiva,

00:09:13: is an example of a company that is using optical sensors to determine emotion based on facial recognition. That was also the case for Emotient, which was a company that Apple bought in January of 2016, and actually,

00:09:27: that was an exit here, and we've seen Microsoft dabble in it, and there have been a number of different competitors within the facial recognition space when it comes to emotion and engagement measurement.

00:09:39: You can see why, because, like you mentioned, the one sensor that we carry around with us all the time,

00:09:45: is the camera. The camera is growing to be the battleground for mobile OEMs right now, and also we're seeing companies state that they're camera companies, like Snapchat, which just did an IPO,

00:09:58: you can see how leveraging a camera to determine the emotion of the user,

00:10:04: could unlock much more personalized experiences and also allow for all of these companies that rely on the mobile device to better understand their user.

00:10:12: Yeah, and beyond face, we're seeing voice recognition using vocal intonation, like Beyond Verbal. And just thinking, I mean, the question I've always asked is why can't Siri understand how I'm feeling from my tone, instead of me having to spell it out in a sentence,

00:10:26: you know, anger, instead of me saying it, from the voice intonation alone.

00:10:30: I really don't think it's that hard, I really don't. I mean, the emotional measurement that's possible now is so out of this world, like,

00:10:40: seven years ago I was talking to my professors, and we had a full class, like a full week-long class, as to why NLP would never work, and it's like showing them papers from, you know, deep,

00:10:53: learning models they've already uncovered, like making voices sound real, as well as voice-to-voice,

00:11:02: processing. So,

00:11:04: it's growing at an exponential rate, and it is really guided by machine learning on the processing aspect of it, and so there's a lot going on, but I think,

00:11:15: some voice assistants are better than others now, and you know, Siri, which used to be pretty much top of its game at this, is kind of falling behind with emotion, but,

00:11:25: I think that's the current state of emotion tracking, right? It is kind of like voice and faces, but right now I don't think I've used a single app,

00:11:34: in the last month that has, like, facial tracking and knows how I'm feeling, right? I mean, it's still something that is,

00:11:42: coming. Where is it at, or what would you consider the current state? There actually are three instances: we have face, voice, and then wearables. So, wearables that measure, say, your heart rate, your skin temperature, exactly, and also the electrical,

00:11:58: electrodermal state of your skin, which is basically sweat, right. And so there are solutions out there,

00:12:07: and there are startups that are focusing on it, but why are we not seeing them integrated into Uber, or integrated into Spotify, or Facebook using them? Mainly it's because of the fact that,

00:12:18: the sensors, the wearable sensors, aren't on everybody's body, and also there's some work to be done still to,

00:12:26: really educate the end user on,

00:12:32: having their emotions measured. There are a lot of privacy concerns around that, it's a very vulnerable state. You know, it's one thing to know what your behavior is online,

00:12:41: but it's another to really understand how you're feeling about what you're doing online.

00:12:47: Yeah, the privacy concerns are probably the first thing that would pop up in someone's mind as to why they wouldn't want to track emotions, but I think the benefits,

00:12:57: which aren't really there yet, I mean, it's hard to convince people of them yet, but the benefits totally outweigh the drawbacks of having this.

00:13:05: Sure, the privacy concerns remain in the end, but you know, if you do have anonymization going into it,

00:13:12: I don't know, they're trying to figure out how to democratize crowdsourced data and still keep it,

00:13:18: anonymous and not traced back to a single person, so,

00:13:23: I think those privacy concerns can definitely be squashed if the use case itself is promising. Same with AR, same with VR, whatever other,

00:13:32: emerging technology we're talking about. Now, I actually might have had a few experiences with, like, emotion tech, and the thing,

00:13:41: that got me excited was with music. Music is very emotional, as we all know.

00:13:46: I think we go through different cycles of the different music that we listen to, and that is also very much tied to what you're feeling at the moment.

00:13:56: To the point that, and this is probably a little-known feature that Last.fm had,

00:14:02: I don't even know if there are people still using that service, but basically they had something called scrobbling, where it just basically pulled in your iPod music, whenever you listened to something it pulled that data,

00:14:15: every time you plugged it into the computer, and it logged it, and so you have a full list of, like, everything you listened to on your iPod, on your iTunes, whatever,

00:14:23: and Spotify as well. And then they actually had this plug-in that charted it,

00:14:29: over emotions over time. The x-axis was time, the y-axis was emotion, and it was these overlapping, kind of like sine waves, of, okay, this month you're listening to more upbeat,

00:14:40: faster music, whatever, and the next month it's sad. And it turned out, you know, I was actually,

00:14:46: having the seasonal affective disorder, SAD, right, like the wintertime sadness, kind of like that Lana Del Rey song where she sings about,

00:14:58: summertime sadness. That really excited me. Unfortunately that service actually shut down within Last.fm, and it's so sad, because it was something that I would go to every six months,

00:15:09: log in, look at it again, and then compare it to my real-life experiences, just everyday life things. I mean,

00:15:19: those insights, if they're presented in the right way, I think can totally change people's decision-making when it comes to doing things, because people aren't very good at understanding their own trends. I think we are very,

00:15:33: distorted by, like,

00:15:34: cognitive biases, by confirmation bias. I mean, we like to make our own rationale for the decisions that we make rather than facing the data.

00:15:45: Yeah, and,

00:15:47: I think we are already starting to see the quantified self on the emotional level occurring. For example, a company,

00:15:56: called Feel is creating a wristband that basically will,

00:16:00: leverage skin temperature and GSR and HRV sensors to be able to measure your emotional tone on a daily basis and then map it out day after day, week after week,

00:16:13: and then help you better understand if you're meeting your emotional well-being goals, and then also send notifications on your companion smartphone app,

00:16:22: to be like your in-pocket therapist or your kind of emotional wellness coach. I think,
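[A rough sketch of the kind of daily roll-up a wristband like this might produce: GSR, HRV, and skin-temperature samples collapsed into one score and charted day after day. The weights and the 0-100 scale are arbitrary placeholders for illustration, not the scoring any real device uses.]

```python
from datetime import date
from statistics import mean
from typing import Dict, List

# One day of (hypothetical) wearable samples, keyed by sensor name.
DaySamples = Dict[str, List[float]]  # {"gsr": [...], "hrv": [...], "skin_temp": [...]}

def daily_emotion_index(samples: DaySamples) -> float:
    """Collapse a day of sensor samples into a single 0-100 'emotional tone' score."""
    gsr = mean(samples["gsr"])              # higher GSR -> more arousal/stress
    hrv = mean(samples["hrv"])              # higher HRV -> typically calmer
    skin_temp = mean(samples["skin_temp"])
    raw = 0.5 * hrv - 0.3 * gsr + 0.2 * skin_temp  # placeholder weighting
    return max(0.0, min(100.0, raw))

def weekly_trend(history: Dict[date, DaySamples]) -> List[float]:
    """Map the index out day after day, the way a companion app would chart it."""
    return [daily_emotion_index(day) for _, day in sorted(history.items())]
```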

00:16:29: there are some people, like yourself, that love to look at data, but as we're already seeing on the biometric, kind of step-counting side of wearable technology,

00:16:39: people often don't want to refer to their life in charts and graphs. What they want to see are insights, they want to be told what to do. Actionable data, right, exactly. So we really need to get to the point where,

00:16:52: you know, things are just happening in the background, and then your little assistant is telling you,

00:17:01: you know, "I've noticed that you're really down in the dumps, so I just ordered your favorite meal on Uber, and you should go see this movie, because,"

00:17:10: "it's proven to increase your happiness level by two points," or whatever, and it fits into your schedule, and your girlfriend is also free on her schedule. Yeah,

00:17:20: you know, at the end of the day, if we extrapolate where this is all going and why emotion tech is really important, it's because our technology is such an integral part of our life,

00:17:30: but our technology doesn't know us. And we started to give it,

00:17:33: pieces and glimpses of who we are, such as our page clicks, our behavior online, and then GPS was huge because it allowed technology to know where we are.

00:17:43: Facebook and Twitter started rolling out emojis, which are really primitive, almost like cave-drawing ways, to help them understand how you're feeling.

00:17:53: But until we give our technology the ability to know who we are emotionally, it really can't,

00:18:00: it can't really be the friend that we want it to be. You know, when I talk to you, you can understand whether I'm nervous or if I'm happy, because you're a really smart machine and your machine gets what my machine is feeling. It's just how we are as humans, and,

00:18:14: we know that that is the exact reason why we have Do Not Disturb mode on our phone, because,

00:18:18: the notifications that come to your phone don't understand your context, and that's why they are intrusive, the notifications are still really dumb.

00:18:25: And I just want to see something as simple as, like, every time I listen to music, don't play the notification sound. Like, that should be an option, but it's not. But that's a pretty simple,

00:18:37: grievance that I think emotions and context,

00:18:40: can help deliver on. But this is all in the same realm of, like, your phone understanding the context around you, and emotions are one big part of it, a huge part of it. And you asked, like, where is this space right now? So this space isn't in the hands of the consumer,

00:18:54: per se, it's more on the research, focus group, marketing side of things.

00:19:00: And so, for example, we invested in an emotion tech company called Lightwave, which is just here in California,

00:19:05: and their solution is being used a lot by the film industry to help them better understand the reaction of the audience.

00:19:13: So the Lightwave platform utilizes wearables. Why can wearables work in this situation? Because you can put wearables on everybody in the theater, and you also have one big key factor to be able to really understand emotion, and that's,

00:19:27: context, knowing what they are doing. You have, you know, 7,500 people all doing the same thing, almost like a controlled experiment anyway,

00:19:35: and so they've been able to work with big studios, and there's a great article out there on what they did with The Revenant just recently,

00:19:43: really being able to determine the emotional measurement of the audience as they watch the movie, to better understand what was exciting, what was boring, what was fearful,

00:19:53: and all this data is lucrative data for the cutting room floor, for editing, for future films. And so,

00:20:00: in any way, this emotional data can really help make a movie a hit. Absolutely, and understanding even down to, like, who will like what at certain points, because what,

00:20:13: a lot of movies are trying to do is figure out who to cater to, what demographic, and, like, how much to balance, and so,

00:20:19: you can get into those nuances if you do know who each person is and how they're feeling.

00:20:24: It's interesting, I think this can completely be opened up to, like, the YouTuber community, because you're putting out a video, and it's either going to get seven million views or half a million views if you're,

00:20:36: someone like PewDiePie, and there's a lot of money on the table as to the performance of it. So what if you have a select few, like a group of testers, that,

00:20:44: don't even wear wearables, maybe they do, but you're using the front-facing cam and you're just, like, actually doing analytics on it,

00:20:53: all through the browser, essentially. And I think that's key, because you can use this data in two ways: you can listen and learn,

00:21:01: or you can take this insight and then power and personalize a situation. So for example, back to that film situation, on YouTube,

00:21:09: right now you can leverage these emotion technology platforms to learn more about your users and then make decisions and choices on future content,

00:21:20: but imagine if now you are Netflix and you're creating a dynamic, kind of choose-your-own-adventure story that is powered by that person's emotion,

00:21:30: that's kind of like the next step. And going back to the Last.fm example, like,

00:21:36: imagine I'm wearing an Apple Watch or an Android Wear smartwatch or some sort of wristband that is gathering my emotion,

00:21:42: and as I'm listening to my playlist, the next song is generated based on,

00:21:47: the desired state that I want to be in. Am I going through a break-up and I want to be devastated, so feed me Adele all day long and keep it coming, you know, whatever it is, or am I going through a breakup and I need to be cheered up, so as I start to veer down a path that's making me more sad,

00:22:03: Spotify or whatever system is actually actively being my friend and picking out songs that it knows are going to lift my emotional state.
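[A toy sketch of the "pick the next song for my desired state" idea; not Spotify's or anyone's actual recommendation logic. The valence scores and the step factor are made-up illustrative values.]

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Song:
    title: str
    valence: float  # 0.0 = saddest, 1.0 = happiest (hypothetical per-track score)

def next_song(library: List[Song], current_mood: float, desired_mood: float) -> Song:
    """Pick the track whose valence nudges the listener from the mood a wearable
    reports toward the mood they asked for. A toy heuristic for illustration."""
    # Step only part-way toward the target so the playlist shifts gradually.
    target = current_mood + 0.3 * (desired_mood - current_mood)
    return min(library, key=lambda s: abs(s.valence - target))

# Example: listener is down (0.2) but wants to be cheered up (0.9).
playlist = [Song("Adele ballad", 0.1), Song("Mid-tempo", 0.5), Song("Upbeat pop", 0.9)]
print(next_song(playlist, current_mood=0.2, desired_mood=0.9).title)  # -> "Mid-tempo"
```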

00:22:12: Yeah, and I think, since each person is kind of different, it needs to be a smart system that learns and gets better over time, attuning to, you know,

00:22:22: your tantrums or emotions, just like a spouse would. I think at first you try to generalize your care for them, and then,

00:22:29: it gets more and more personalized, knowing exactly what's going to work best. And that's one of the key challenges, right? The challenge is,

00:22:36: that we're not all the same. Even though, as you mentioned at the top, we have five to seven emotions that we all share, and these platforms are gathering enough data sets to be able to, you know,

00:22:47: throw a dart into those quadrants with, you know, a high level of accuracy, it really will require you to utilize these platforms on a regular basis to really cater and customize them to who you are,

00:23:00: and so there is a lot of learning on the personal level that, you know, your system will need to do in order to,

00:23:06: get to know you. And did I ever tell you the story? Because Honda and SoftBank went on the record last year saying that they want to make an emotionally capable car.

00:23:14: And I reread the article today, and I don't know the specific quote, but what I loved about it was that Honda said that they wanted your car to grow up with you.

00:23:23: And just think, they wanted your car to grow up with you so that you formed a stronger attachment to your vehicle,

00:23:30: so you have even further loyalty to your brand. So just think about that, like we do feel loyal in a way to maybe Apple or Android, mostly because of the user experience. In your car, I mean, you kind of put a face to your car, right? But can you imagine once,

00:23:48: once your assistant, you know, Alexa or Google Home or whoever it might be, once they not only know your emotions,

00:23:57: but they start having their own emotions, right? And so while we are giving them access to how we feel, we're also teaching them how to feel, on the flip side,

00:24:06: and this is where we can extrapolate things to virtual reality in particular, right? In the virtual world,

00:24:12: we wipe away the real world and we're all virtual characters, for lack of a better word, and we come in with avatars and virtual beings surrounding us,

00:24:23: those virtual beings need to emote some way, and these platforms, these affective computing platforms, could empower,

00:24:30: these characters to allow for them to be able to emote, but also allow for you to better understand,

00:24:38: how the real person is feeling beneath the virtual character, right? So it really is enabling the virtual world to mimic what's happening in the real world,

00:24:49: because the sensors will, you know, unveil the emotional layer that's necessary to be gathered and then be ingested. In VR in particular, I think,

00:25:00: there's a huge honeypot that opens up with VR and AR when it comes to emotion and emotion technologies, facial recognition, which,

00:25:11: is just around the corner, eye tracking, down to, like, GSR sensors, but,

00:25:17: there are actually a lot of really good things about VR. One thing is that you're holding two controllers, so those are two wearables already that you have in your hands. If you actually integrate them with, like,

00:25:28: galvanic skin response,

00:25:30: sensors, as well as heart rate, whatever. I'm surprised they don't have that currently, but that's probably just because they don't want to add a layer of complexity. But you can see that coming around. And why you would want that is because,

00:25:42: in VR you have more points of tracking, in terms of, like, you know where their hands are, you know where their head is, you know when they're laughing, you know when they're sad, you know when they're,

00:25:52: like, moving in a very particular way or cautious of it, you know. Or think of them in a virtual scene, and if,

00:25:58: there's something that's attracting them, they're probably, you know,

00:26:03: standing in the room closer to it, or their feet are pointing that way. And so there are a lot more, I guess, points of tracking that you've got in VR and AR.

00:26:11: So from an analytics perspective, that's going to be a gold mine. That's true, actually, and CognitiveVR, which I think you've had on,

00:26:19: on the podcast, one of those analytics companies, actually just opened up their,

00:26:28: platform to be able to leverage other,

00:26:30: data points, biometric data points, to hook them up just like this. So I mean, I think that's coming, and the analytics capability, especially for the enterprise, to utilize it from a simulation perspective,

00:26:42: like medical simulations, or product simulations, sales simulations, and firefighting, you know, dangerous situations, could really benefit from better understanding emotion and then taking all those learnings and bringing them back to the real world.

00:26:57: So yeah, I think you're right, like, again, with emotion tech the lucrative opportunity is to listen and learn,

00:27:02: or to leverage these insights and actually power whatever the content could be. Absolutely, and gaming is huge too, right? So the ability for gaming to react to,

00:27:14: you emotionally is something that's already started to be played with, with Unity games. Affectiva actually has a Unity SDK plug-in, and so,

00:27:24: you can actually leverage emotion recognition or emotional measurement,

00:27:31: to be able to manipulate and change your game based on how engaged your user is or how they're feeling. In VR, for example, there have been a lot of scare games, a lot of scare-tactic games, and so,

00:27:44: you could actually set a threshold where, you know, you don't want to,

00:27:48: exceed, say, ten points of fear, to ensure that there is a safety factor for the end user, or maybe you want to make sure that they reach a high point of fear, and so,

00:28:00: a guy that doesn't really get scared at all keeps sitting at a five, and you keep upping the ante as they continue.
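[A rough sketch of that threshold idea: a per-tick fear score driving how hard the game pushes. The score scale, thresholds, and function are hypothetical, not Affectiva's actual Unity SDK.]

```python
FEAR_CEILING = 10.0  # safety cap: never push the player past this
FEAR_TARGET = 5.0    # if the player never gets scared, up the ante

def adjust_scare_intensity(fear_score: float, intensity: float) -> float:
    """Adapt the game's scare intensity each tick from a measured fear score.

    fear_score: hypothetical 0-10 value reported by an emotion-measurement SDK.
    intensity:  the multiplier the game currently applies to its scare events.
    """
    if fear_score >= FEAR_CEILING:
        return intensity * 0.5            # back off hard, safety first
    if fear_score < FEAR_TARGET:
        return min(intensity * 1.2, 4.0)  # player is unfazed, ramp it up
    return intensity                      # in the sweet spot, hold steady
```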

00:28:10: I'm actually very close to getting this heart rate monitor that I can put on a VR player during our livestreams, where,

00:28:13: you have the view of what they're doing, maybe a camera on them, as well as their heart rate, and it would just be interesting to see how,

00:28:20: it all interacts, as well as the audience in the chat, how do they react as they see heart rates going up.

00:28:27: And I think that's a good point too, because the use of emotions,

00:28:31: as content themselves is another opportunity, and actually Lightwave did this at Wimbledon with Jaguar, where they measured the emotions of the crowd and then overlaid that on the YouTube videos that were going out. Amazing, I mean, think about it,

00:28:45: how popular reaction videos are,

00:28:47: reaction videos are just pure emotion, right, in a video, and then it's content on top of content, so that's the content itself,

00:28:55: but there's so much there. People, all they do is literally, they suck in other people's emotions, and either mirror them or repel them,

00:29:04: and so there's so much there. And you know, when you watch Facebook Live, I've noticed it,

00:29:09: even some of the brands are putting up meters of the emojis or the likes. This, again, is this primitive way of us wanting to understand how people are feeling while they're watching this, and,

00:29:21: the like, the delight, the love, the wow,

00:29:25: these are really not our true emotions. This is what happens with social media, it's managed, and yes, you're choosing how you want to be displayed in the social paradigm. And so,

00:29:35: what's really amazing but also really scary about emotion technology is that it's showing how you really feel, and are we really ready to have a Facebook,

00:29:46: which shows how I really felt about your picture or your status update? And I don't think that's,

00:29:53: exactly, like, the direction that we're going to be headed, at least.

00:29:57: That's what we all thought initially with the internet, like, you're just going to express everything, and lifeloggers are just going to put out every single second of their life. But now it's curated, everything that you put out on the internet is curated,

00:30:08: and I don't think people will want to give that away, and these social platforms aren't going to want to, like,

00:30:13: say, "now we're going to put your picture up, and actually 30% of your friends had a disgusted feeling towards it," you know, that's not going to be something that they're going to want to encourage their users to experience.

00:30:24: So I think it will probably have, like, an automatic,

00:30:29: tracking of your emotion, and it'll prompt you, like, "hey, do you want to annotate this with happiness or whatever?" for that piece of content on Facebook, but,

00:30:38: that's how I think we'll ever get to the point, at least on the front-facing side of a piece of content, of showing what people are feeling,

00:30:47: at least on a personal level. But I think the big players will be really hesitant moving in that direction. When these platforms become even more widely used and their SDKs are in play,

00:30:59: I can definitely see a start-up creating a social network that is based upon true emotion, and then we'll see if that's actually where people want to go,

00:31:07: and I think, in a time of fake news, for example, this need for authenticity,

00:31:15: this time of transparency and this time of vulnerability, it's actually quite needed, and I think the timing is right, even culturally, to be able to start to introduce this type of,

00:31:25: technology. But again, there are a lot of privacy concerns, and then beyond that, I think one of the things we haven't really talked about is, although many of these startups that we've named are neuroscience-backed and they have many data points churning through their platforms,

00:31:40: there's still a lot to be learned about emotions in general, and then, you know, what Affectiva defines as happy versus other companies,

00:31:50: you know, there needs to be some standardization that may occur over time as this space matures. So there's some room for improvement, and that's why, when it comes to emotion tech, one of the safest places to look at right now is engagement, how engaged is the user, right? And engagement is a safe,

00:32:08: metric to measure, because that's almost like an on and off, you're either engaging or you're not. I mean,

00:32:13: it's obviously variable, but, like, any point of reaction or data that you can give a video on Facebook is engagement, right, rather than you passively watching it. I guess that's an easy one to track.

00:32:25: But what, I mean, what's the next step,

00:32:28: when it comes to the analytics we're pulling from people? Well, I think engagement is first, engagement tells you a lot about your user. What does engagement mean, I mean, what is it, what does it mean to you, is it,

00:32:41: whether they're engaging with it positively or negatively? Like, that's the scope of insight, sentiment is there, but also just understanding whether your user is,

00:32:52: engaged and has, like, energy behind them, like, are they engaged with your content, are they glazed over, or are they bored with your content. Like, there's a lot of insight to be gained in just,

00:33:04: how your end user is interacting with your content over time. So for example, in sports, in any sports setting, a stadium,

00:33:13: you know, what were the most exciting plays, what were the most boring periods of the game, all these insights could be used for,

00:33:24: broadcast, but they could also be used by the advertisers in the stadium to understand, you know, when,

00:33:30: section 201 keeps falling asleep at this point for some reason, maybe because they can't see something, so you could say, well, maybe we need to change something in the venue,

00:33:39: or we can start sending Pepsi over to get their heart rate up and just, like, get them moving and get them excited. So even though we may not be able to specifically determine if those people are happy or angry or hurt or sad,

00:33:53: just understanding whether or not they're engaged and what level of engagement they have could help advertisers, venues, broadcasters,

00:34:00: even the people themselves, understand kind of what's going on. And the content creators too, of course. You know, it's funny, this,

00:34:07: you describing people at a stadium, and understanding groups and how to affect them, what we're talking about is, like, macro neuroscience, because it's,

00:34:17: a similar problem to neurons in the brain that people try to understand, where you can either be looking at an individual neuron or, like, a huge group of neurons, but,

00:34:25: you're still trying to figure out exactly what the factors are as to why they're firing and how they're firing.

00:34:29: And that's just one brain. Now you're taking a hundred thousand brains and you're trying to understand how they're all interacting together.

00:34:37: So probably similar principles will arise from each of those, kind of like the nature of physics, where it's the same principles from the small to the huge scale. So,

00:34:49: starting with, like, an individual person and understanding their emotions, and going up to a macro scale, I think there are going to be some,

00:34:56: amazing discoveries that we're going to start seeing soon enough, and maybe we're not all that different emotionally than we think, honestly,

00:35:04: just culturally there's different upbringing. We won't know unless we actively participate in,

00:35:12: this evolving next wave of computing, which really does force us to ask how,

00:35:19: much we value our privacy, and also our data, the data that we own and the ownership of that data,

00:35:25: whether it's emotional data or mental data, or it's physiological and biometric data like steps or heart rate,

00:35:34: it's the same conversation that we need to have. And whether cameras are watching us or whether we put sensors on our body, we do need to have a come-to-Jesus moment with, you know, how,

00:35:44: we want to handle this question of who owns this data, how do we trade this data, where is it stored, what is its security. So this conversation, this privacy and security conversation, comes up time and time again when I talk about wearable tech,

00:35:58: and it is an important one to be had, because it actually impacts not just wearables,

00:36:02: and emotions, but also AR and self-driving cars, because the fact of the matter is that what we're doing is waking up the world to be able to see and understand us, and if we don't,

00:36:17: let it do that, then this magic of the future can't really happen.

00:36:23: I mean, obviously it's because the scale of technology has been growing much faster than the scale of security of that technology, and,

00:36:31: we're just at that weird Wild Wild West moment where we're trying to figure out how to not let things get out of hand. And it's the same thing with AI, I think that's why,

00:36:41: we have organizations like OpenAI trying to kind of guide the development in a certain direction.

00:36:49: Tom, I mean, we're talking a lot about, I think, idealistic ideas that we want and why this is important, but where are we currently? And,

00:36:58: I'm asking on a personal level, like, do you use any of the tracking, like emotion tracking, in your life to help you make certain decisions? Currently, what can you do if you're interested in using this data?

00:37:13: Well, there are some devices that you can pre-order or use from a consumer perspective,

00:37:20: that measure emotion, and I would say that where they're at on the consumer side is more in stress management,

00:37:26: okay, so again, looking more at your heart rate and determining whether or not you're stressed and then walking you through some breathing exercises to get you down to a calmer, happier level,

00:37:43: but I would say that when it comes to emotions there's not a lot of activity in the consumer sphere,

00:37:50: there's a lot more activity happening on the enterprise level,

00:37:53: which only makes sense, I think, for now. Then again, like, that's the focus groups. I mean, there is probably a more fun example of the use of emotion tech, again,

00:38:03: kind of going back to one of Lightwave's earlier experiences, where they worked with Pepsi on a reactive,

00:38:07: concert experience, where the crowd's level of engagement and emotions were measured and they would unlock experiences as the night went on. And to me that's exhilarating, right? And so talk about, like, being in a stadium where,

00:38:23: you know, let's say the Super Bowl, where the Super Bowl crowd needs to reach a level of engagement in order to,

00:38:31: allow for Lady Gaga to dive from the roof. That's like a livestream, almost, right? It's like early Twitch livestreams that are all about donations and incentives, whatever,

00:38:41: we're almost there, where you unlock things, and you get all these things on the personal livestreaming apps like LiveMe, Lively, they all,

00:38:48: all the big ones incorporate that engagement level, with keeping things on the screen, with keeping gifts coming in order to earn money eventually,

00:38:57: and unlocking the next stage. It would be interesting to add that into a real-life,

00:39:02: setting, because people are familiar with that concept of crowd engagement. That is really interesting. So,

00:39:10: I would say, like, for the average listener at home that's trying to use emotion tech, you know, in their daily life,

00:39:16: it's still to come. There are some apps on Apple Watch, for example, there are some specific companies like Feel, like Moodmetric, like Vinaya, which,

00:39:27: is from the UK, which I'm not sure if they're still working on that project, but there are some companies that are trying to utilize wearables to detect emotion,

00:39:35: and even MIT debuted a smartwatch app on the Simband from Samsung, so there are some rumblings happening on the consumer side, but this isn't really something that you're going to be able to get hold of,

00:39:50: you know, tomorrow on a wearable. And what's the next milestone that we're looking to head towards? Is it eyes, like eye tracking? Because I think eyes give away a ton of,

00:40:02: emotions. We've talked about voice, facial, and wearables. I think we're going to see a lot of movement on the facial side. If you look at the emotion tech space, there's a lot more activity happening on the facial side, and we're already seeing big players like Apple,

00:40:17: make acquisitions. And again, we all have cameras, we all stare at our phones, and we are starting to put cameras in our homes, like,

00:40:26: connected home security systems, and we look at webcams on our laptops, and so,

00:40:32: I think that the next wave that we will see is to have our,

00:40:37: phones, our smartphones, allow for emotional measurement based on the cameras that they have.

00:40:43: And this can be utilized, again, from just a listening perspective, so the companies that have apps aren't doing anything with it outwardly, they're just listening and learning,

00:40:55: or you can go as far as personalizing the experience because of your emotion. And in fact, Affectiva and a GIF platform announced that,

00:41:05: they're working together to be able to,

00:41:08: attach emoji to each of their GIFs, so that you can search by emoji and then eventually search by just your face, by your emotions.

00:41:17: Like, imagine search by emotions, imagine, you know, again, Siri understanding your emotion, and imagine your emotion being considered to make it quicker to pick those emojis on Facebook, as you mentioned. So I think that's where we're going to see it first,
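[A toy illustration of "search by emotion": a detected emotion label mapped to an emoji tag, then used to filter tagged GIFs. The mapping, catalogue, and function names are all hypothetical.]

```python
from typing import Dict, List

# Hypothetical mapping from a detected emotion label to an emoji tag.
EMOTION_TO_EMOJI: Dict[str, str] = {
    "joy": "😂", "sadness": "😢", "anger": "😠",
    "surprise": "😮", "fear": "😱", "disgust": "🤢",
}

# Toy catalogue: each GIF carries the emoji tags it was annotated with.
GIF_CATALOGUE: Dict[str, List[str]] = {
    "laughing-cat.gif": ["😂"],
    "crying-puppy.gif": ["😢"],
    "jump-scare.gif": ["😱", "😮"],
}

def search_by_emotion(detected_emotion: str) -> List[str]:
    """Turn the emotion a camera-based SDK reports into an emoji tag and return
    matching GIFs -- 'search by your face' reduced to a dictionary lookup."""
    emoji = EMOTION_TO_EMOJI.get(detected_emotion)
    return [name for name, tags in GIF_CATALOGUE.items() if emoji in tags]

print(search_by_emotion("fear"))  # -> ['jump-scare.gif']
```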

00:41:31: before the wearables become a little bit more ubiquitous, because the challenge is, you can get some amazing information from HRV, GSR, skin temperature,

00:41:40: but the kicker there is that everybody needs to be wearing a sensor, and we still have not,

00:41:45: won that battle quite yet. Once we do have wearable sensors on our bodies, whether it's clothing or watches or wristbands, then that,

00:41:54: combination of voice, face, and biometrics will allow for our systems to know us even more robustly, and that's going to be when we can really unlock maybe a more personal level of emotional measurement.

00:42:08: I'm interested in it, and I want to throw a little quick shout-out to people that would like to get into this space now. Actually, the first,

00:42:16: exposure, apart from music, that I had to emotion tracking was sleep tracking,

00:42:20: and Sleep Cycle, which is an iPhone app that you put next to you, originally it was actually just something you put on your bed next to you, that was using the,

00:42:28: IMU sensors inside to know how much you're moving throughout your night, and it actually wakes you up in this 30-minute window that you give it.

00:42:36: Within that window it uses the data that it has about which cycle of sleep you're in, whether you're in light sleep or deep sleep,

00:42:45: and it wakes you up in your light sleep, actually, in that 30-minute window, so you don't wake up groggy, you don't wake up from, like, a deep dream,

00:42:53: or REM. It's funny, you don't dream in your deep sleep, you dream in, like, your light sleep, which is another layer on top of it.
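[A minimal sketch of that wake-window idea, using phone movement as a stand-in for light sleep; a toy version of the concept, not Sleep Cycle's actual algorithm.]

```python
from datetime import datetime, timedelta
from typing import List, Tuple

# Each sample: (timestamp, movement level from the phone's IMU; higher = more movement)
MovementSample = Tuple[datetime, float]

LIGHT_SLEEP_THRESHOLD = 0.3  # assumed: noticeable movement suggests light sleep

def pick_wake_time(samples: List[MovementSample],
                   alarm: datetime,
                   window: timedelta = timedelta(minutes=30)) -> datetime:
    """Wake the sleeper at the first light-sleep moment inside the 30-minute
    window before the alarm, falling back to the alarm time itself."""
    window_start = alarm - window
    for timestamp, movement in samples:  # samples assumed in chronological order
        if window_start <= timestamp <= alarm and movement >= LIGHT_SLEEP_THRESHOLD:
            return timestamp  # moving = likely light sleep, safe to wake now
    return alarm  # never surfaced into light sleep, use the hard deadline
```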

00:43:05: The other thing I remember is it also had a mood tracker inside of it, and so when you wake up you tell it how you're feeling, and then before you go to sleep it actually asks you if you had coffee or drank tea, there are certain things you can check off, and then it gives you those,

00:43:17: and if you did it over a couple of months, you could just see all those things affecting your sleep or your wake-up mood.

00:43:25: So that was, like, an interesting thing you can just do right now, and I used it for a while actually, and it would,

00:43:32: give you a percentage as to how well you slept that night, so it kind of gamified sleep for me.

00:43:38: And it's great that you mention that explicit information you had to give that app, because I also see that as one of the first steps, where, yeah, maybe we have the facial recognition on our phones,

00:43:47: and it will say, "hey Tom, I see that you're happy, am I right?"

00:43:52: So there will be some of that teaching, I think, that we'll have to do in order for these experiences to work. But,

00:43:57: I should tell you that one of my favorite mood technology experiences that I had was using,

00:44:04: a sweater that was made by the California-based company called Sensoree,

00:44:08: and they have this mood sweater that has LEDs in the cowl of the sweater, and the sweater itself has GSR sensors that you wear on your hands. And as you're wearing the sweater,

00:44:19: the LEDs change to correspond to what your mood is, pinks and yellows and purples and reds, and there's a full index that I can't remember, but,

00:44:28: it's anything from excitement to, like, fear and anxiety.

00:44:32: Why do I like this? Well, I love this ability for technology to allow me to show the rest of the world how I feel, and a lot of people may feel a little bit uneasy about that, but,

00:44:41: imagine if we really were that vulnerable with how we felt, and we allowed the rest of the world just to see how we truly feel in that moment.

00:44:50: It could really increase empathy, I think, and it could really allow for, you know, the people around us to quickly have some meaningful connections,

00:44:57: because inevitably you'll have the people that see the fearful, anxious individuals, who are themselves the confident, calm,

00:45:04: ones, and want to find that balance, because that's one thing that we know about our world, that we do strive for that yin and yang balance,

00:45:10: and I think that a device like Sensoree's mood sweater could allow for that ability to unlock further empathy, further connection,

00:45:19: on top of the fact that I think it's really an interesting concept to think that we could have connected clothing that is personalized by our emotions. So maybe we don't go as far as Sensoree, where, you know, it shows whether you're happy or sad,

00:45:31: but the clothing that I wear is connected, and clothing right now already expresses how I feel, you know, I'm a teenager so I'm wearing, like, a band t-shirt, so now I have, like,

00:45:44: flexible LEDs,

00:45:47: luminescent fibers, whatever it might be, in my clothing, and because that clothing is tapped into how I feel, it could change color, it could change form, it can,

00:45:55: print up new designs. I find that really fascinating, it's another example of how, if we give technology access to who we really are, it could really help us not only communicate but also have a personalized sense of style.

00:46:09: I think so too. I'll just push back a little bit on the ability to express,

00:46:16: all the primal emotions. I think people are very hesitant, first of all, for a lot of reasons, and,

00:46:23: I'm doubtful that we're going to want to be more raw with expressing ourselves. So in a utopia,

00:46:36: maybe there's a movement of people trying to empathize, and just being, you know, hippy-dippy, whatever you want to call it, where,

00:46:45: we're surrounded by them in San Francisco, and I'm probably one of them. I mean, I don't like to seem,

00:46:51: I don't want my life or myself to seem fake when it comes to my emotions, and I'm very expressive with emotions actually, if you hear me on the podcast,

00:47:01: but I think the internet is still a very curated,

00:47:05: and probably will always be, like, a curated sense of things, and that's what your virtual avatars will be as well, it's always going to be somewhat curated. But don't you see VR as the chance for us to break that mold and start to actually,

00:47:18: create an actual world where we can be ourselves? I don't think so. I mean, I don't disagree, I think my Twitter account is, like, more me than, like, my real me, honestly, but that's still a curated Twitter account, and I think,

00:47:31: the avatar that you'll end up creating in VR will still be, like, the,

00:47:35: maybe not the idealized sense of yourself, but the part of yourself that you want to be, or that you want to, you know, interact with. So,

00:47:42: I don't know, I think there's going to be some experimentation happening with, like, what you can do with emotions, and I'm stoked to be a part of it, really.

00:47:52: And I agree with you, I don't think right out of the gate we're going to want to have everything that we feel, you know, put on paper, put on the internet, right, and I don't want everything we think to be documented as well, so that's what I'm saying. But I do see us,

00:48:06: moving in the direction where we have the ability to be much more truthful and also provide much more transparency. I think we need to let go a little bit of these facades that we have created, these constructs on the internet,

00:48:20: in order to really connect, because you really can't connect with an Instagram account that's only showing you beauty,

00:48:27: you just can't. And so I think that this technology has the ability to unlock that,

00:48:32: but at the same time there are so many dystopian outcomes, you know, discriminatory outcomes, you know, that same situation where I'm wearing that mood sweater down the street, if it's showing people that,

00:48:44: like, what if there was a really tragic accident that happened and for some reason I have these weird feelings, and,

00:48:51: meanwhile my, you know, Twitter showed that I was happy, like, would that lead to total isolation? So I think, right, like,

00:48:59: it's a brand new world that we're entering and we're going to see how it all shapes up. Yeah, but what we know is that this technology is going to have,

00:49:09: the capability to access pieces of us that it has never had before. And for your apps to be dynamic enough to understand you and see what you're feeling and to change themselves to fit you better, I think that's the future that we all want to have, this,

00:49:24: better apps, better technology that we use, and fewer frustrations. And whether or not we use emotions to get there,

00:49:33: well, we probably will use them to get there, and whether or not we have it expressed out to the world, we'll see. Thank you, Tom Emrich, for joining us again. Thank you,

00:49:42: for another ResearchVR podcast. This was a fun one, not talking about VR specifically, but about the technologies that arise around VR and that can integrate into VR and AR.

00:49:52: I'm sure we'll revisit this idea, just like last time when we talked about hearing and hearables.

00:50:00: If you haven't heard that episode, go back, I think two episodes, and you'll hear it with Tom Emrich. Thank you again, Tom, thank you so much.

00:50:08: Music.
