EPISODE 1739 [INTRO] [0:00:00] ANNOUNCER: The Humane Pin is a multimodal wearable device designed by Humane Inc., a startup co-founded by former Apple employees Imran Chaudhri and Bethany Bongiorno. This wearable is part of a broader vision to create more seamless and integrated interactions between humans and technology, moving away from traditional screens. George Kedenburg III is a software designer at Humane and Josh Dickens is a software and product designer at Humane. They join the podcast with Sean Falconer to talk about Humane and the technology the company is developing. This episode is hosted by Sean Falconer. Check the show notes for more information on Sean's work and where to find him. [EPISODE] [0:00:51] SF: George and Josh, welcome to the show. [0:00:54] JD: Hi, thank you. [0:00:55] GK: Thanks for having us. [0:00:55] SF: Yes, awesome. Well, thanks for coming back here. This is attempt number two at this. In full transparency to the audience, we actually recorded an interview about a month ago in person at the Config Conference in San Francisco, which was great. It was nice to actually do this in person. But unfortunately, due to circumstances out of our control, we lost that interview. Very unfortunate since I would say, that had to have been easily the best podcast in the history of podcasts. So, we'll just have to see if we can try to recreate the magic today. [0:01:22] JD: Live up to the enormous standard that we set. [0:01:24] SF: Yes, exactly. Well, let's start with an introduction. George, who are you? What do you do? [0:01:30] GK: Sure, yes. So, my name's George. I am a software designer here at Humane. I've been here about two and a half years. As of lately, mostly been focused on AI stuff and exploring LLMs and models and how that works within our product, which is very AI-heavy. [0:01:51] SF: Awesome. And then Josh, over to you? [0:01:53] JD: Yes. I'm Josh Dickens. I'm a software designer, product designer here. 
Most of my focus has been on the kind of user interface, as it were, of the AI Pin. A lot of that is the laser display that we use, but also really thinking about the entire end-to-end multimodal interactions, whether it's using your voice, the laser, the touchpad, and kind of everything in between. [0:02:17] SF: Yes, awesome. I definitely want to get into all that. I mean, I think things get really complicated when you start to talk about multimodal, both on the AI model side as well as just from a user interaction standpoint. But maybe we can start off with a little bit about the background on Humane. It seems like a bunch of really smart, great people from ex-Apple and other places known for design kind of just came together and were like, "Hey, let's start a company and build some technology for the future." I don't know, is that a fair assessment of essentially what happened there? [0:02:51] GK: Yes. I mean, I think that seems pretty accurate. I think, yes, it's a lot of people who kind of feel like maybe technology isn't going in the direction that it should be going. Maybe there's less emphasis being put on actually making things easier and simpler. What if we worked on some things to maybe push things in the other direction? [0:03:13] JD: Yes, I think one of the things that appealed to me about Humane before I joined, and when I joined, it was still very much in stealth, so I didn't really know much, if anything, about the actual product. But the vision that I heard from Imran and Bethany and other people at the company was this idea that everything is becoming compute in a way. Everything is getting compute added to it. Everything is going to be affected by software. But today, our way of kind of interacting with all that is through screens. There's one side of the kind of tech world that is really pushing us to think about putting those screens directly to our eyeballs.
Another side, and I think what Humane's vision from very early on was, what if compute was more invisible and you didn't have to use a screen to benefit from all these great advances in technology? [0:04:09] SF: In terms of your decision-making process with joining a company where you didn't really know what they were doing, it was still in stealth, how do you make a decision like that? Is it based primarily on, like, "Hey, I really believe in the founding team. These people are incredible. So, it doesn't really matter what they're working on. I'm sure there's something cool happening there." [0:04:28] GK: Yes. I mean, it seems maybe silly to say, but it was really just vibes. I think I also didn't know what I was going to be working on before I joined, but I remember just having a lot of really good conversations as I was interviewing, and just the things we were talking about, and hearing other people's viewpoints on those things. Everything just really resonated. I remember thinking, what's the worst it could be? As long as we're not making like an NFT watch or something, then it's probably going to be something cool. I definitely think that something I've realized throughout my career is that you can be working on the coolest product ever, but if you're working with people that you don't really connect with or that don't really get it the same way, then it doesn't really matter, because the work isn't ever going to be that good. I just put some trust in the fact that the people seemed great and they seemed to be motivated by the right things. We'll see what's on the other side. [0:05:25] SF: Maybe this is true of all early-stage companies, in that hopefully they have an ambitious vision, but a company with a vision as ambitious as this is maybe a little bit different than what some other companies are doing.
Are there unique challenges that come along with having such ambitious goals? [0:05:45] JD: Yes, absolutely. I mean, I think one, when your goals are really ambitious and you kind of set a high bar, anytime you fail to achieve that, you're setting yourself up for a lot of criticism. It's much easier to just go after something very familiar. I could have gone to work on another app, having worked on apps most of my career, and it goes without saying we had a pretty rocky launch when we launched the product. But in my mind, the opportunity to be a part of something that was trying to do something big - and for many of the ways that we did fail, there are a lot of ways that we succeeded, and we've continued to kind of iterate and make things better. Just as an individual, that's an opportunity to grow and learn, and that's something that, you know, I wouldn't trade for anything. [0:06:33] SF: Yes. So, the first product that you guys launched was a little over a year ago, the AI Pin. Can you give a little bit of background on that for anybody who's maybe not familiar with the product? What is it as a user that I'm doing with essentially the AI Pin? And how does that work? [0:06:48] JD: Yes, so the AI Pin is essentially a wearable AI assistant that's meant to harness the power of AI to let you tap into the things you might use your phone or your computer for throughout the day, without having to pull out your phone or your computer to do them, so that you can be more present. It has more contextual awareness, so it can see what's around you and help you achieve tasks in the real world. [0:07:14] GK: Yes, and maybe it's worth talking about the form factor a little bit. It's a wearable device. It magnetically attaches to clothing. You can also use it in your hand if you don't want to put it on, which sometimes when I'm at home or something, I'll just have it in my pocket and use it handheld.
But generally, it's attached to clothing, and where it sits, it has a similar field of view to what you see. So that sets us up to really take advantage of these AI vision models as they get better and faster, to be able to offer you contextual awareness - not just taking your voice as input, but taking your voice and what you're looking at and all of these different pieces of context, and using those to make smarter and better decisions about how to handle your request. I personally resonate with the form factor a lot because sometimes I like to wear watches that aren't smartwatches. Sometimes I have multiple pairs of glasses that I like to mix up. For me, if you think about yourself as a video game character, those slots are sometimes occupied in ways that I don't really want to change my behavior around. So, it's nice that I can just sort of put this thing on every day. It doesn't really - I don't think about it that much. It doesn't really take a lot away from how I go through my life. But then it's there whenever I need it. Whenever I have a question, or whenever I want to do something, it's just right there waiting. [0:08:35] JD: Yes, the other thing I'll just mention, because it's one of my favorite things about the AI Pin, is it's just a great hands-free camera. You can take photos and videos with it, which have a different quality than if you pull out your phone and try to take a picture of your kid. As soon as the phone comes out, your kid runs away, as my kids do. So, this lets me kind of capture those memories in a more seamless way, and that's pretty valuable just in and of itself. [0:09:03] SF: Yes. I think the challenge that I run into with my phone is if I'm walking with my kids - my kids are very young - I'm pushing a stroller or something like that, and maybe I'm listening in one ear to a podcast for research purposes.
Then I want to take a note, then I got to stop the podcast, pull out the phone, stop pushing the stroller, take the note, and then start that whole process again. It's a pretty intrusive, disruptive process. Maybe I see a Slack message on my phone that I need to answer for work or something like that. So, then something that should have literally taken like five seconds, to just jot down an idea, I've now been distracted for five minutes and my kid's upset. So, having a less intrusive process to do something like that sounds pretty magical. [0:09:45] GK: Yes, I think you nailed it. There's so much friction and there's so much opportunity for distraction with the devices that we have today. I, as someone with ADHD, really appreciate being able to just ask a question and get an answer without having to wade through the Times Square-ness of my phone, because yes, very often I pull my phone out to do something and then 10 minutes later I'm like, where am I? How did I get here? What was I supposed to be doing? The Pin is sort of this efficient access, and it's not trying to do anything other than whatever you're asking it to do. So, it's never attempting to derail you or give you other things to think about. [0:10:24] SF: Yes. Do you see this as replacing other forms of technology? Or is it more like better enablement of use cases that maybe are not that well served by existing technology? [0:10:36] GK: Yes, I really think about it in terms of just offering people more choice. I think that today we have a lot of choice in form factor, of what size of a screen we want, but we don't have a choice of, do I want a screen or not? There is a lot of focus and efficiency that comes with just simplifying everything down to a conversation. It's a very streamlined thing and you can be very efficient about it. I think that that choice sounds pretty great.
I think it's easy to paint this as a smartphone-killer picture, that we're trying to replace all the devices in your life with this little Pin that you wear on your clothing. Obviously, that is not a realistic goal, the same way that our phones didn't replace the laptop computers that we have, and those didn't replace the desktop computers that we have. Yes, nowadays, you can just live your life with an iPad and you can kind of get your stuff done. But those other form factors still exist. They are still useful in certain ways, but our relationship with those has changed because of the choice that we're offered now. So, that's how I think about it. It's just giving people more choice. [0:11:49] JD: Yes, it's complementary more than a replacement. [0:11:51] GK: I think definitely, the vision is one day, but like, yes, maybe you can go out with just a Pin on, the same way that now you could go on a trip with just an iPad and not really feel like you miss your laptop very much. But we clearly have a lot of work to do before we can get there, but that's the journey that we're trying to take. [0:12:10] SF: It seems like, if you look historically at a lot of technology, we've kind of - humans have forced ourselves to learn how to use that technology or introduce that technology in our lives. It seems like one of the ambitions of the AI Pin, or the sense that I get, is instead of making a human sort of adapt themselves to the technology, let's create technology that's sort of adapted for the human and the way that someone wants to operate. So, with that, given that you need it to kind of be unobtrusive, maybe sit on someone's shirt, probably not be too heavy, all these types of things - how does that factor into the design considerations that you have to make? Some of the unique challenges that you have versus maybe traditional technology that people are used to building?
[0:12:53] JD: Yes, I mean, I think the caveat is we did not work on the hardware aspects of this, but I think that we've learned through designing the software for it that that form factor and the kind of human factors considerations are really critical. There are things that were designed early on. So, for instance, this is mostly - it's like a voice-first interface, right? You use your voice, which is great in a lot of contexts, but it's not great in some contexts. So, you might be at a loud bar or at a very quiet place. In either of those situations, using your voice maybe isn't the best interface. That's where having this kind of multimodal interaction, of having the laser display that lets you bring up a visual interface when you need it without having to rely on it, is really crucial. Then, I think, yes, I mean, you mentioned the amount of engineering that went into packing this tiny little device with everything that a cell phone does, plus a 3D depth sensor and a laser display - that's pretty remarkable. And to have it still look like something you'd want to wear, feel somewhat fashionable, and fit in with your look is a lot of the considerations. [0:14:09] SF: I know you didn't work on the hardware, but I'm just curious about even your experience there or your thoughts on balancing the aesthetic of the device - it's something you're wearing, it's kind of fitting in like jewelry in some sense, right? But you also need to fit in all this hardware. So, I would think that there are some tradeoffs that have to be made, in terms of like, "Hey, we need it to do this, but maybe adding that thing makes it not look so good." [0:14:34] GK: Yes.
I mean, I think there's probably no shortage of things that we wish we could have put in the device, just for the reasons you stated of size and space, and also power and thermal considerations. It's been really interesting for me coming from the app software world that currently exists, where you don't ever really think about any of that most of the time. You're not thinking about what power impact your app is going to have on someone's phone or on their computer. When you're designing an operating system, suddenly you have to care about all these things. Every decision you make becomes very important in the grand scheme of things. So, yes, I think there's a lot of the vision that's been left unrealized just because at some point you have to decide to cut things and to make something that we can make now. It'll be interesting to see where this all goes in a few years, for sure. [0:15:25] SF: What were some of the technologies that were involved in actually making the AI Pin possible? [0:15:31] GK: So many. [0:15:33] JD: I mean, I think early on, Imran and Bethany, when they founded the company, had this belief that the capabilities of machine learning and AI and computer vision were going to unlock new types of interactions and new types of compute experiences. What's been amazing is they made these bets seven years or - [0:15:58] GK: Yes, somewhere. [0:15:59] JD: Over five years ago, and many of them have paid off in a way that, like, we have these large language models that are incredibly powerful and getting smarter every day. I think before we shipped the AI Pin, many of our functions were built using OpenAI's technology, among others. Going from, say, GPT-3.5 to GPT-4o is a massive step change in the capabilities of what this device and others can do. Being able to have a 3D depth sensor - Microsoft had the Kinect years ago for the Xbox.
It was this big bar that sat across the top of your TV, and to have an equivalent type of technology in this tiny package that can understand your hand and the gesture it's making is another core kind of technology improvement and advancement that's made this product and this form factor possible. [0:16:59] GK: I think it's also the smallest laser projector to ship in a real product, which I'm sure was a lot of work. I know from some experience that even figuring out how we can get it to run for more than like nine seconds without totally overheating the device - all of that stuff was very interesting and challenging to think through and explore. But yes, it's pretty crazy. We're doing things today that a year ago we thought would be impossible. I think that just goes to show the rate at which all of this technology is advancing. I mean, a lot of times I say that I feel like this product is going to change every six months, and change for the better, and sort of drastically be different than what it was capable of six months before that, just because it's really set up to take advantage of all of these amazing advancements that are happening all the time. It's on us to just figure out which of those are worth our time, and make a good experience and package those up for our customers and get them out as soon as we can. I think it took us three days to switch over to GPT-4o, which was amazing to me. I still can't believe that we were able to do that so quickly. So, we're just going to try to keep doing that and keep making the product better. [0:18:11] JD: Yes, one of the coolest things, I think, about AI Pin is sort of our operating system, which we call Cosmos. There's device-side intelligence, there's cloud-side intelligence, and then there are all of these services that the AI can choose to use - this AI bus that can go off and perform searches on the web or get the weather or do calculations.
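The "AI bus" Josh describes - a cloud-side registry of services the assistant can dispatch a request to - can be sketched roughly as follows. This is a hypothetical illustration, not Humane's actual Cosmos code: the `SERVICES` registry, the `register` decorator, and the handler names are all invented, and the handlers here return canned strings where a real system would call external APIs.

```python
# Hypothetical sketch of a cloud-side "AI bus": a registry of services
# the assistant can route a request to. New services can be registered
# on the server without shipping a device update. All names are invented.

from typing import Callable, Dict

SERVICES: Dict[str, Callable[[str], str]] = {}

def register(name: str):
    """Decorator that adds a handler to the cloud-side registry."""
    def wrap(fn: Callable[[str], str]) -> Callable[[str], str]:
        SERVICES[name] = fn
        return fn
    return wrap

@register("weather")
def weather(query: str) -> str:
    # Stand-in for a real weather API call.
    return "It's 18°C and clear."

@register("calculator")
def calculator(query: str) -> str:
    # Toy arithmetic only; builtins are stripped from the eval namespace.
    return str(eval(query, {"__builtins__": {}}))

def route(intent: str, query: str) -> str:
    """Dispatch a request; unknown intents fall back to a default reply."""
    handler = SERVICES.get(intent)
    return handler(query) if handler else "I can't do that yet."
```

With this shape, `route("calculator", "2 + 3")` returns `"5"`, and adding a new capability is just registering another handler server-side - which matches the point made here that the cloud can get smarter without the user downloading anything.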
We can add functionality to that without having to ship a software update, anything that the user actually has to wait to download. The cloud can just get smarter, which is great. [0:18:44] SF: Was some of the thought process behind building your own operating system in part due to some of the hardware constraints that you're under, since you presumably have limited CPU power and limited battery power? Every bit matters, essentially, so you have to roll your own operating system to fully optimize that in the end, and then make decisions about what's happening on device versus what's happening in the cloud? [0:19:07] JD: Yes. I mean, I think you said it well. I mean, having control over the full stack and deciding which things are cloud-based. I think that decision of doing a lot of the intelligence in the cloud has really paid off. It's another thing that means that this device can get smarter without a user having to buy a new device in a year or two. The hardware won't obsolete itself just to take advantage of these new technologies. [0:19:35] SF: What were some of the design decisions that went into making this a self-contained device versus something that is paired and synced with your phone? [0:19:43] GK: I think it's really back to that longer-term vision. I think it's important to us that people perceive this as a standalone device rather than as a companion to something else, because hopefully one day you will be able to just go out in the world with this thing on and not feel like you're missing something. If we had maybe taken a different path from earlier on and, say, shipped an app with the device that you used to set it up or something, I feel like we would have used that connection as a crutch to not have to solve some of the harder things that we had to solve to make this thing work completely on its own.
But now that we are there, I think it's exciting to imagine how we can start to go the other direction a little bit. Something that we're working on at the moment is the ability to pair your phone with your Pin and have the Pin be able to consume all your notifications from your phone and do smart stuff with them - summarize them, rank them, all of that kind of stuff. That stuff's pretty exciting. I'm excited for us to continue to look at what it looks like to work with these other devices more, now that we've established that this is its own thing. It doesn't require anything else. But, hey, it might be kind of nice to use it in conjunction with some other things. [0:20:57] JD: I think there are some customers who want to leave their phone behind today and who are able to. The Pin actually does provide the right functionality that they need. It's all about re-evaluating your relationship with technology. I think a lot of people are going through that these days, of like, I spend too much time scrolling on TikTok and I spend too much time wading through my notifications. This idea of a device that kind of helps you be more present lends itself to, let's just make it a standalone device so you don't have to rely on your phone. We know your phone's not going away, but it's not a requirement. [0:21:39] SF: I think especially coming sort of out of the pandemic, at least I see from even within friends and stuff like that, a growing trend to try to have a healthier balance between technology and non-technology essentially - breaks from social media, breaks from news, breaks from screens and so forth, pre-programmed into their week and stuff like that. Because I love technology. I mean, I wouldn't host a podcast focused on technology and engineering if I didn't. But too much of anything's maybe not healthy. [0:22:08] JD: Yes. I think I'm in the same boat. I love all the gadgets and all the things.
Yes, for me, the pandemic was a real kind of awakening to, wow, we miss so much when we're disconnected from people, and you start to see all the little ways that we stay disconnected because of all these other intrusions into our attention. So, I think that's what appeals to me about what the Pin can do for people. [0:22:38] SF: So, given that there's no like - you can speak to it, interact with it that way, and if you need to see something, you're doing essentially a laser display on, say, your hand or something - how do people actually, when they get a Pin, configure it? How does that setup process actually work? [0:22:55] GK: Yes. There are two parts. When you order the Pin, you provide some information on the web. You can connect your Google or your Apple accounts so it knows who your contacts are. We're working - I think we're testing Google Calendar support right now. So, you can do a lot of that stuff even before your Pin gets delivered to your door. Then when you get it, the main thing that you have to do is essentially, well, one, turn it on, learn some gestures, because interacting with the laser is this one-handed gestural interface. And then you put in a passcode. Because it is a standalone device, because it has a cell phone connection, for most people, that's it. You're ready to go because you've done kind of the heavy lifting before you got the device. Now, you can connect it to Wi-Fi if you need to, if you're in a poor service area for whatever reason. But essentially, it's like, learn the laser, put in your passcode, and start asking it questions. [0:23:54] SF: Okay. What were some of the design cycles like? How did you go about doing user testing and so on before the launch? Was it primarily people within Humane that were testing, or were you using people outside of the company as well? [0:24:09] GK: I mean, it was a really interesting process, when two years ago, two and a half years ago, we didn't have devices. We didn't have hardware to use.
It was still in the process of being invented. So, I remember joining and starting to work on things, and it was very exploratory, probably for the first year, maybe year and a half, of just, we knew that we had all these things in flight, but we didn't really know where they were all going to end up. We didn't know what kind of laser fidelity we would have. What kind of input method would we have? Where would AI be? Like I said, the stuff that we're doing today, we didn't think we could do two years ago. It's because we just didn't really know where a lot of this stuff would end up. There was a lot of exploration. There was a lot of just trial and error internally, of just, we'd design out the system and we'd get to a point where we'd hit a wall where it's like, "Oh, this doesn't feel very good. Or this feels too complicated. This isn't a thing that we actually want to exist." Then we'd blow it all up and start over. I say start over, but you blow it up and you have all these pieces left over and you start to reconfigure them in a different way. Every time we did that, we learned a little more about what felt good and what made sense. So, it was just a lot of that until a certain point we were able to have devices. Then that was a whole different journey of bug fixing, and QA testing, and all that kind of stuff. But, yes, it was a very ambiguous couple of years, for sure. [0:25:39] JD: As a designer who, like I mentioned, mostly worked on apps for his career, to work on not just new hardware, but new hardware with all new input mechanisms and everything, before any of it was really ready - the hardware was essentially a known quantity, but the software that runs the hardware was still being brought up. So, the interesting challenge for our team was how do you solve some of these problems before the hardware is ready to actually validate them. We did a lot of prototyping in other tools.
George and I are big fans of a tool called Origami Studio, and we did tons of prototypes using phones as a stand-in for the AI Pin and the laser interface, just because that was a known quantity to us. But you really also have to kind of keep thinking outside of the box of what a screen can do, because they're just totally different constraints. We would take those prototypes - most of our testing for a long period of time was internal, because we were still very much in stealth mode. Now, it's great that we're publicly out there. There are products in the field. We have a great contingent of beta testers. We have this really avid Discord community who gives us feedback, and we launch new features to them and we learn from them too. That's been really exciting to me - just learning what are the problems and things that people are actually excited about, because you work on this stuff and it's just a lot of guesses. Until you get into people's hands, you actually don't know. [0:27:15] SF: You mentioned earlier that when you did do the launch of the AI Pin, there was some criticism and negative feedback that came in from that. I guess, how did you incorporate that feedback into new versions? Did you change anything in terms of your thoughts around the design and launch of future versions? [0:27:33] GK: Yes. I mean, I think there's always, especially in these zero-to-one phases, things that you think are going to be really important that end up not being as important as you thought. And the inverse: things that you didn't think were going to be important that actually ended up being. I think I got those right, but you never really know until you know. I think that while it was painful to go through that process, it was also really kind of a gift to just have a very clear list of, "Hey, here's all the things that we need to fix and here's all the stuff that we need to improve on." I don't know.
It was kind of awesome to just watch the team really galvanize around that. We just went heads down for, it felt like years, but it was, I think, a couple weeks. [0:28:14] JD: Six weeks, maybe? [0:28:14] GK: Six weeks. We just hammered a lot of stuff out. I think that was really inspiring. [0:28:20] JD: Yes. We took all the different reviews that came out, as well as a bunch of the feedback we were getting from our Discord community, and put it in a big spreadsheet, and everyone just formed teams around solving problems - around using the laser, or battery issues, or whatever it was - and just went heads down and went to fix it. We released what we call 114. It was kind of our maintenance update, and it really felt like a different product from what we launched. So, I think, yes, there are a lot of lessons learned. But in a way, having that kind of singular focus was really good for the product. [0:28:58] SF: Yes. I mean, it's hard to design something, especially something new, completely in a vacuum, right? There's the Reid Hoffman quote about no product surviving its first encounter with real users. Whatever you think the path is, someone's going to go in a different route. I'm curious, have you seen from users of the Pin, maybe in the Discord community or otherwise, completely different uses of the Pin that you never would have thought of that you're now trying to actually productize? [0:29:25] GK: Definitely, for sure. I would say that we spent a long time thinking about this stuff before it came out. So, I feel like a lot of this stuff was, "Oh, yes, we knew we wanted to do that for sure." I think more than anything, I've just been surprised by the ability for people to figure out creative ways to do things with the Pin that we didn't sort of anticipate. I think one example is we have this feature called notes, which is essentially just an easy way to store and recall information.
We always imagined it would be the kind of thing that you use for, "Oh, my wife likes this kind of food. Or the door code to this place is this." But we had one - or not one, a few - Discord users, a few community members, who figured out ways to, first, reverse engineer our web APIs, so they could interact with notes programmatically. Then they figured out that you could basically write and read from notes like it was a database. So, I think we had people that were pushing their Tesla status into their notes so they could ask their Pin what their car's charge level was. Or I think there was one user who was using notes to back his D&D characters, and he had notes for all the different characters and could modify them just through voice, saying, "Oh, yes. Add a hit point to this character," whatever. That kind of stuff, I think, was not super surprising, but just very cool to see the ways people are pushing the very limited things that we've offered so far. It's definitely inspiring as we go forward and think about, what does an SDK look like for this kind of product? What does it look like to be someone who's building things for the Pin? [0:31:02] SF: Yes. I mean, it sounds like they basically hacked an API so that they could build their own app experiences, which, I mean, is a really good signal for future product investment. [0:31:11] GK: Definitely. [0:31:12] JD: Yes. Listening to what your users are hacking is also a good signal. [0:31:17] SF: That's not even merely like, "Hey, I think this would be cool" type of suggestion. They actually did the work to - [0:31:21] JD: Yes. You're like, "Oh, it validates - what? Yes." Literally, that's a prototype that we don't have to build, and we actually understand that people want to use it, so it's great.
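The notes-as-a-database trick George describes - pushing structured state like a car's charge level or a D&D character sheet into a note, then reading and updating it - can be sketched like this. The `NotesStore` class below is an invented in-memory stand-in for the reverse-engineered web API; it is not a real Humane SDK, and the note titles and JSON layout are illustrative only.

```python
# Hypothetical sketch of treating a notes feature as a tiny key-value
# database, the way community members did via the reverse-engineered
# web API. NotesStore is an in-memory stand-in, not a real Humane API.

import json

class NotesStore:
    def __init__(self) -> None:
        self._notes: dict[str, str] = {}

    def write(self, title: str, body: str) -> None:
        # Create or overwrite a note.
        self._notes[title] = body

    def read(self, title: str) -> str:
        # Return the note body, or an empty string if it doesn't exist.
        return self._notes.get(title, "")

store = NotesStore()

# Push external state (e.g. a car's charge level) into a note, so asking
# "what's my car's charge?" can be answered by recalling the note.
store.write("tesla", json.dumps({"charge_pct": 72}))

# Treat a note as a record and apply a voice-style update,
# e.g. "add a hit point to this character".
store.write("dnd:ranger", json.dumps({"hp": 14}))
char = json.loads(store.read("dnd:ranger"))
char["hp"] += 1
store.write("dnd:ranger", json.dumps(char))
```

The serialization format is the hack's whole interface: as long as the note body round-trips through JSON, any external script can act as a "write path" and the Pin's voice recall acts as the "read path".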
[0:31:32] SF: Josh, given your previous experience designing for apps and then moving into this world of hardware and multimodality, did that change anything substantially in the way that you have to go through design cycles and design a product or a user experience? [0:31:48] JD: Well, there are a lot of things I've learned over the process of designing for the AI Pin. A lot of my habits, the ways I approach design problems when I have a screen and the fidelity of a screen, I've had to rethink. In a lot of our early design iterations, we just had way too many buttons. So, part of the design process was really stripping things away from the UI, partly because of the fidelity of the laser. You can't see fine details. It's a single color. That makes it a different type of interface to interact with. Then it's this gestural interface, which is not direct manipulation like with your finger. I think the other, bigger mindset shift is thinking about things as multimodal. How do I do it with my voice? How do I do it with the laser? How do I do it with touch? A great example of that is the default way you unlock your AI Pin, which is with this hand gesture where you move your hand in and out, kind of like you're playing a trombone. As you move your hand in and out, you highlight different numbers and you pick them to enter your passcode. We also use this to dial a phone number. It was a really great solution for the fidelity of interaction that we had when we first launched. But it had some major drawbacks. For one, it's time-consuming as a way to enter numbers. And if you're in bright sun, the laser is a little harder to read, so you sometimes wouldn't be able to unlock your AI Pin because you couldn't tell which number you were picking.
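The "trombone" unlock Josh describes maps hand distance to a selected digit. Humane hasn't published how that mapping works, so this is a hypothetical sketch under simple assumptions: a usable distance range divided into equal bands, one per digit, with out-of-range distances clamped. The range values and band scheme are invented for illustration.

```python
def digit_for_distance(distance_cm, near=10.0, far=40.0, digits=10):
    """Map a hand distance (cm) to one of `digits` selectable numbers.

    `near`/`far` bound the usable range (illustrative values);
    distances outside it clamp to the nearest end digit.
    """
    clamped = max(near, min(far, distance_cm))
    # Split the range into equal bands, one band per digit.
    band = (far - near) / digits
    index = int((clamped - near) / band)
    # The far edge would otherwise compute one past the last digit.
    return min(index, digits - 1)

print(digit_for_distance(10.0))   # → 0 (hand close in)
print(digit_for_distance(25.0))   # → 5 (mid-range)
print(digit_for_distance(40.0))   # → 9 (arm extended)
```

This also makes the bright-sun failure mode concrete: the mapping itself still works, but if the user can't read which band the laser is highlighting, they can't confirm the digit before selecting it.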
So, we rethought how you unlock your AI Pin with something we call touch code, which we're testing now and shipping soon. Instead of using a gesture in the laser, you just use the touchpad and draw a little gesture on it to unlock. So, that's one part of a different modality of interaction, the gesture on the touchpad. But also, to set that up, you have a whole voice flow that guides you, because the whole point of this feature is to not use the laser. So, you need to think about how to tell the user, "Okay, draw a circle. Now, draw your gesture," and walk them through it using voice in a way that feels natural and doesn't feel annoying, because being talked to by a computer doesn't always feel great. So, finding that balance has been an interesting mindset shift for me as a designer. [0:34:18] SF: With voice, if you look at some of the traditional voice investments in the IoT space, say the G-Home and Amazon's Alex products, so we don't set off any devices, I think where some of those devices have suffered is that they haven't really had a killer app experience. They end up being mostly turning the lights on and off and setting timers; those are the most common use cases. Do you think, or do you know, for the AI Pin, is there a killer experience? [0:34:48] GK: It's a tough question. The reason that I struggle with it is that I don't put it on every day because I anticipate some situation where I'm going to need it. It's more that I put it on because I don't know when I'm just going to have a question about something, or when I'm going to see something that I wish I could take a photo of very easily. Currently, I think of it more as kind of a safety net rather than, "Oh, I can't wait to go to this coffee shop to use this app or something."
It's just there, and I don't know when I'm going to need it, but I know that I want it around for when I do. I think that in and of itself is kind of hard for people to wrap their head around, especially in this industry that we're in that's so app-focused and obsessed, trying to figure out what's going to give us a bajillion users in a day and how we can make this thing so cool and interesting. We're not trying to do that. We're trying to give more than we take from people and not be this thing that demands your attention all the time. So, yes, I think that's how I think about it. Personally, I can't wait. I hope we do a Waymo integration sometime soon, because I would really like to be able to call a car. I think that is one thing that I would use a lot. But, yes, it's more about having that intelligence there when you need it. I guess that's the killer feature. [0:36:15] JD: Yes, to me, it's the feature we call AI Mic, which, in some ways, is just asking questions and doing searches, some things that are familiar if you've used one of those home devices. So, you can ask for the weather. Sure, that's something I do all the time. But all those random questions my kids ask me through the day, where I'm like, "No, I don't know who scored the most three-pointers in 1997." I can just ask those questions. Like you mentioned before, I can do it without getting distracted. [0:36:47] SF: Yes, you don't have to put a screen up between you - [0:36:48] JD: I don't have to intermediate it with a screen. To me, that is the killer feature: getting access to the information that exists on the Internet without getting distracted by all the other information that exists on the Internet. [0:37:04] SF: Yes. I think that's a good point.
Even if you think about people experiencing things in real time, in the real world, like a concert, we enjoy that experience so much that we want to capture it in some way. But then we end up sort of living it through our phone, because we're holding it in front of our face in order to capture it. That doesn't feel like an ideal situation. So, if you can live in the moment and experience it while also capturing it in a non-intrusive way, you're having the technology work for you rather than you working for the technology. [0:37:35] JD: Yes, exactly. I love, when I've gone to concerts recently, just capturing little videos while I'm there. Because I'm still in the concert. I'm watching it. I'm not thinking about, "Let me zoom in." I'm just experiencing it. Then after the fact, I have this very raw, point-of-view video of what it felt like to be there, mixed in with the people. I think the dirty little secret of all of those concert videos is that nobody wants to watch them. [0:38:08] SF: Yes. Well, and that's the thing. We capture all these things and then we never actually revisit them or look at them. [0:38:14] JD: I mean, it's proof that you were there, proof to yourself, to your future self, that you were there. There's value in that. I think that's important for people. That's why I still like to capture things with it. But it gives me, like you said, the best of both worlds. I get to be present, but then get to have that memory to relive it, remember it, and be like, "Oh, yes, that was a fun night." [0:38:36] SF: Do you think that there are certain innovations that have to happen in order for wearables to become more commonplace? To get to a place where they are more present, like a smartphone? [0:38:46] GK: It's a good question. I mean, I'm biased, of course, but I wear this thing every day.
It doesn't bother me. It doesn't feel like a chore. It hasn't gotten to the point where I'm like, "Oh, I've got to wear my stupid Pin because I'm going to work and I don't want it." It's just easy. I just throw it on and it's not a big deal. I think for us, the challenge is more about making it more useful, finding more utility to put in this thing so that more people feel like it's something they need or want to have throughout the day. I think every little feature we add will make it click for another subset of people, who are like, "Oh, I didn't get it before, but now I get it because you have this thing or that thing." So, we've just got to keep doing that. And hopefully, we add enough utility that you start seeing more people with Pins on. Sure, it'd be great if it was the size of a button and had infinite battery life and all of that, and it'll just keep getting smaller and better in that regard over time. But I don't think the hardware is what's holding it back. [0:39:57] SF: So, we've talked a little bit about how the Pin uses AI to provide some of its functionality. Clearly, AI is having more than a moment right now. It's changing the way people do all kinds of things, like engineering, with more and more people relying on co-pilots to help them write code. How is AI actually changing the way that you design products? [0:40:20] JD: Well, talking about all those prototypes we build, that's a big part of how we've been using it. We're designers, we're not engineers per se. We dabble. But being able to write code with the help of AI has radically accelerated the fidelity and the depth of understanding that our prototypes allow us to experience and feel.
I mean, George can speak to this more than I can, but the kinds of things we're able to proof-of-concept with the help of these AI co-pilots are just scratching the surface. I think there's a whole other realm of using AI to explore problem spaces on your behalf. A big part of what you do as a designer is think about the 10 different ways you might design a thing and then pick the one that's good, and AI can be a useful way to explore those 10 different options before picking the one that is actually good. I don't know. Other thoughts, George? [0:41:23] GK: Yes. I mean, I think it really has just infinitely raised the ceiling of what we as designers can pull off on our own now. Almost all of the prototypes I've made over the past two years, okay, I could have made them before, but it would have taken me a week to figure out how to do X, Y, or Z in some language that I'm not super familiar with, or to debug a thing that I don't really understand. So, the ability to augment my coding abilities with these tools has made it possible for us to make things that, again, we could have made, but it probably would have taken us a whole year to make some of the stuff that we've made in months. That's been pretty great. [0:42:07] SF: Right. So, you can essentially move faster to create a high-fidelity, more realistic prototype than you could have otherwise. Maybe you could have built it, but the cost-benefit might not have been worth it, so you'd be stuck doing something a little lower fidelity. [0:42:24] GK: Yes. I think it has this snowball effect, almost, where as we're making stuff, we don't really know what these tools can do in the first place. So, we try something and maybe get something working, and we're like, "Oh, crap, that's crazy. We got that working."
The bar just gets higher every time, because you stop hitting this wall of, "Oh, well, I tried to make this, but I didn't finish because it got hard and I had to give up." There's a lot less of that, which means that every time we go to make something, we can be a little more ambitious than the time before. At some point, we had basically the entire AI Pin operating system built in some combination of Origami, an iOS app, and a Python thing running on a server. The thought of making that two years ago? Obviously we could have done it, but it would have seemed like such an insurmountable thing that we would have just looked for an easier way to do it. [0:43:18] JD: Let's just fake it. [0:43:19] GK: Yes. Instead, we were able to make it pretty real. It was probably some of the most high-fidelity prototyping I've ever done, in that there was no edge. Normally when you prototype, there's a, "Don't cross this line, or don't go past this point, because the prototype will break." Or, "Don't say this kind of thing, because we can't really handle it yet." That is just gone. It does all the things we know it can do, and then some. One example of this, in the prototype I'm talking about, is that you could ask it what the weather was. Obviously, we didn't have a weather endpoint hooked up at the time, but we just used GPT to make up some weather forecasts. So, you could ask for the weather for any city and we would make up what that would look like for you. It felt real. It felt like, "Oh, is this working? Is this real?" But it wasn't. It was all fake. But it didn't feel fake anymore. [0:44:10] SF: Yes. Even the utility around faking data that seems credible is really valuable. I do that all the time in demos and prototypes that I build. You just need to seed the database with some realistic-looking data.
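The "fake the weather with an LLM" trick above can be sketched roughly as follows. This is an illustrative reconstruction, not Humane's prototype code: the `llm` parameter is a placeholder for whatever model call you'd actually wire up, and an offline stub stands in by default so the sketch runs without a network.

```python
import json
import random

def fake_forecast(city, llm=None):
    """Return a plausible-looking (but invented) forecast for any city.

    If an `llm` callable is provided, it is asked to fabricate the
    forecast as JSON; otherwise an offline stub fabricates one,
    seeded by the city name so repeated demos stay consistent.
    """
    prompt = (
        f"Invent a plausible weather forecast for {city}. "
        "Respond as JSON with keys: condition, high_f, low_f."
    )
    if llm is not None:
        return json.loads(llm(prompt))
    rng = random.Random(city)  # deterministic per city
    high = rng.randint(55, 90)
    return {
        "condition": rng.choice(["Sunny", "Cloudy", "Light rain", "Foggy"]),
        "high_f": high,
        "low_f": high - rng.randint(8, 15),
    }

forecast = fake_forecast("San Francisco")
print(forecast)
```

The design point is that the prototype never hits a hard edge: any city name yields a credible answer, so testers can wander off the scripted path without the illusion breaking.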
Now, you can just auto-generate that, versus having to write a script and maybe a bunch of functions to generate something that seems realistic. It completely shortcuts the time investment you'd otherwise have to make. What's next for Humane and the AI Pin? [0:44:38] GK: Great question. A lot. I mean, obviously, we're going to continue, like I said, making this thing more useful and finding more things for it to help our customers with. So, I think that's a big area of constant exploration. We're also working on what the developer experience looks like for the AI Pin and how we can open things up and make it easier for companies, customers, and whoever else to integrate into Cosmos and the AI bus and participate in the platform. So, I think that's exciting. Then, we are thinking about new products and new stuff to do, exploring what we do next and where we should go from here. There are a lot of exciting opportunities and things we could do. So, we're just curious to see how that stuff pans out. [0:45:30] SF: Yes. I mean, how do you balance all the things you could do against the things that are reasonable to do? [0:45:37] GK: Yes. I don't know. If somebody knows, please tell me. But I think a lot of it is really about having an opinion. For a lot of companies, especially bigger companies, their opinion just becomes whatever drives growth, right? Whatever makes the number go in the right direction. I think that's a way to make the number go up, but it's not necessarily a way to make a great product. The reason they choose to do that is that they don't really know what to do. They don't have a vision or an opinion for what the thing should be. It's just whatever will grow the most. We're lucky enough to be a design-led company. So, we have some really opinionated leaders, in the best way: they have a vision, and they know what they want to see in the world.
It's just a lot of exploring and seeing what feels good. What do we think we want to use? Then kind of banking on there being other people out there like us who would also want something like that to exist. [0:46:36] SF: Awesome. Well, George, Josh, thanks so much for being here. Thanks for doing this again. I think we made it. We did it. [0:46:44] GK: It's even better the second time, I think. Now that one's the second-best podcast in the world and this is the first. [0:46:51] SF: Awesome. I hope that's the case. Cheers. [0:46:52] JD: Thanks a lot, Sean. [END]