---
|
|
|
title: "Thomas Stachura" |
|
|
|
status: "Auto-transcribed by reduct.video with minor edits by James Parker" |
|
|
|
---
|
|
|
|
|
|
|
James Parker (00:00:00) - Thanks so much for taking the time to speak with us, Thomas. Perhaps you could just begin by telling us a little bit about yourself and your work, and how you come to be heading up an organization like Paranoid.
|
|
|
|
|
|
|
Thomas Stachura (00:00:14) - So I guess the concept started when I was in my brother-in-law's suite, his apartment, and my kids were there for the first time. It was the first interaction of my kids with any smart speaker, because of course we don't own one at home. And I found myself entertained by their repeated attempts to request things of this smart speaker device. But at the same time, I found myself contemplating what it would be like to have it at home, and feeling unease surrounding that. In fact, I was uneasy even in those initial questions with somebody else's smart speaker, with having my kids recorded and, you know, what parts of our conversation might be captured or not. And perhaps I'm being quote-unquote paranoid about it, but it caused me a lot of unease. And in that moment, being of a sort of inventor mindset, I said: there's gotta be a solution to this.
|
|
|
|
|
|
|
Thomas Stachura (00:01:20) - Why can't I have the best of both worlds: the fun and excitement of watching my kids play with the smart speaker, but without the anxiety of it, without the worry about whether I should even worry about the privacy, and that battle. Like, do I have to make a choice? Should it be tech or privacy today; which one do I want? And so that prompted the concept, which later prompted patents, and then of course the full business model. As for my background, going further back: I grew up, you know, programming and doing software development in my mother's basement. That's really where I got my start. And from there, it evolved to running a business in order to give myself an opportunity to invent more software. So I'm an inventor at heart, and what I really needed to do was to find a business that would allow me to just be an inventor.
|
|
|
|
|
|
|
Thomas Stachura (00:02:17) - And that's remarkably hard. Businesses don't just put out job postings for, hey, we want an inventor with interesting ideas that he wants to explore; typically that doesn't happen. And so, out of necessity, I went from learning how to program to running a business, becoming a sales guy, then a project manager, all the way to an executive leading a team. And now we have a hundred people at Pleasant Solutions. All of that was out of necessity, to just do inventing, to just have those golden moments. Much like a lawyer's golden moment is to say, you know, "you can't handle the truth," but really there's weeks and months of prep leading up to that. I wanted my golden moment, where I'm watching my kid play with a smart speaker and my golden moment is: aha, I've got an idea to stop the privacy issue from happening while still enjoying the tech. Fortunately, I have this hundred-person company that can now execute on it and actually make it a reality. And so I felt vindicated that I built out a company that can actually take my ideas and bring them into real life.
|
|
|
|
|
|
|
Sean Dockray (00:03:27) - What would you say is the proportion of time that you get to spend on R and D, versus, um, the commercial work that funds that more interesting, inventing R and D part of the company?
|
|
|
|
|
|
|
Thomas Stachura (00:03:40) - I'll answer that both for myself, being the inventor versus manager, and for the rest of the company as well. For myself as an inventor, I would say that I get maybe 2% of my time actually inventing. And sometimes it comes in a concentration, where for two days I have to write out all of my ideas into a patent because it's now or never; so for two days I have to come up with every possible way to do this, so that we can patent every possible way. But usually there's at least 5% additional time that's inspirational, where it's not necessarily that I'm inventing; I'm seeking inspiration. I'm not, you know, watching the clouds pass by, or watching videos that are inspirational, but rather doing things that are inherently inspirational: I'm motivating a team member, I'm hearing their problems, I'm hearing what things they want to resolve. So in general, I would say that's an additional 5%; a total of maybe 7%. The rest of it is just literally running the business, and not getting to say, you know, "you can't handle the truth," so to speak.
|
|
|
|
|
|
|
Thomas Stachura (00:05:01) - The company as a whole? It depends on how you define R and D, because we do a ton of R and D for clients, but that's sort of billable work. I mean, I guess if you consider the D, development, software development: almost everything we do is developing stuff. So if I were to define R and D as the high-risk stuff, where we don't know whether it'll actually work, or it's the first version of something that we hope will work, I would say we only get maybe 10, 15% of our time on that currently, but it's increasing. And then we spend a substantial amount of time executing after that. So Paranoid was conceived by me; we got a prototype with one other engineer, and then, to actually bring it to market, we now have seven engineers working full time, mechanical and electrical, as well as consultants. So to an extent you might consider that R and D still, even though by my definition it's no longer a risk: we know we're developing different models, increasing compatibility. It's no longer the big R; it's more of the big D.
|
|
|
|
|
|
|
James Parker (00:06:21) - The sort of Eureka moment that you described: how long ago was that now?
|
|
|
|
|
|
|
Thomas Stachura (00:06:28) - I don't know, I'd have to look it up. I'm not the best at remembering timelines, but I would say somewhere on the order of three years ago.
|
|
|
|
|
|
|
James Parker (00:06:37) - So it was a smart speaker. Because smart speakers, you know, have obviously not been around very long, and yet achieved a pretty major market penetration very quickly. But it was a smart speaker, rather than, you know, a digital voice assistant on an iPhone or something, that produced that moment?
|
|
|
|
|
|
|
Thomas Stachura (00:06:56) - Yes, yes. And it was our first interaction as a family with a smart speaker. And in fact it was Google. I don't remember what model, but it was a Google one.
|
|
|
|
|
|
|
James Parker (00:07:08) - And so you founded the company three or so years ago, out of this sort of Eureka moment. Was Paranoid the first product that you started working on, and then Pleasant Solutions developed out of that? And what's the relationship between Paranoid and Pleasant Solutions? I mean, are they all sort of privacy-oriented, with somewhat of a similar mission, or is it that they're all in audio? What's the relationship?
|
|
|
|
|
|
|
Thomas Stachura (00:07:38) - So actually Pleasant Solutions is 13 years old; it's been around for a while, with a hundred people. And Paranoid is a couple of years old; we didn't start it until then because it's meant to commercialize the patent. And so the relationship is that, technically speaking, Paranoid is a sibling company in the Pleasant Solutions group of companies. And one interesting element of that is that the companies are very, very different in premise. Pleasant Solutions does make use of Google Analytics; we do make use of a lot of privacy-suspect tools. Our motto is "experts you wish you called the first time." We do data analytics, we've done artificial intelligence systems, all that kind of stuff. And when we started Paranoid, I knew I needed to segment it off completely, to be a company that has a foundational allergy to data. And so it took a little bit of work. And our motto there is "earning lots of money by increasing privacy, not eroding it," as well as "uniting tech and privacy: get Paranoid."
|
|
|
|
|
|
|
Thomas Stachura (00:08:52) - And so there was a little bit of friction there at first. For example, all of my web design teams and my analytics guys and my marketing team: when I told them, no, our websites can have no cookies, they said, okay, well, we have cookie-free Google Analytics. I'm like, nope, we're not using any third-party analytics. Then they said, okay, well, we can make use of this platform that does heat mapping. Nope, we can't do that. And they're just, you know, panicked: well, we've got no data, we're getting nothing. And I said, we're going to do this the old-fashioned way. If we want a little bit more insight, we're going to call up a few customers, or we'll pay for customer surveys, like, you know, focus groups or whatever; the old-fashioned way. So Paranoid is actually flying blind relative to what we're used to at Pleasant Solutions, because we have no analytic data about our website; it's absolutely minimal. And in fact, we use it as a pitch at the bottom of our website: no cookies. And let me tell you, it is painful.
|
|
|
|
|
|
|
Thomas Stachura (00:09:55) - It is painful, because I don't know. Every time we do a podcast, every time we do an interview or a press release goes out, all we can see is, basically, like GeoCities of old, the counter going up. Like, okay, we got a lot of hits today from who knows where, but we got a lot of hits today, so it seems like it was probably this press release. So there was quite a bit of cultural separation required to get the team to understand that it's a very different premise: treat it like it's a client that wants something completely different from what Pleasant Solutions would do. And that's where we came up with mottos like "allergic to data," because it helps our team understand just to what extent we're going to avoid data. And we're going to be spending time researching not how long we can keep people's data in store, but what the legal minimum is, where we can just start deleting data. Do we have to retain their phone number? Can we delete that? Do we have to have a receipt number? Can we delete that? We're going to be looking, at every turn, for where we can eliminate data, where we just don't know anything about our customers, as much as possible.
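For illustration, here is a minimal sketch of what that kind of "delete everything beyond the legal minimum" retention job could look like. The schema, the field names, and the 90-day window are invented assumptions for the example, not Paranoid's actual system.

```python
# Hypothetical retention job in the spirit of "allergic to data": instead of
# asking how long data may be kept, find the legal minimum and delete the rest.
import sqlite3
from datetime import datetime, timedelta

LEGAL_MINIMUM_DAYS = 90  # assumption: whatever the jurisdiction actually requires

def purge_customer_data(db_path: str) -> None:
    conn = sqlite3.connect(db_path)
    cutoff = (datetime.utcnow() - timedelta(days=LEGAL_MINIMUM_DAYS)).isoformat()
    with conn:
        # Fields that are not legally required (like a phone number) are
        # cleared as soon as they have served their purpose, not merely
        # when the retention window ends.
        conn.execute("UPDATE orders SET phone_number = NULL WHERE shipped = 1")
        # Whole records go once the legally mandated window has passed.
        conn.execute("DELETE FROM orders WHERE created_at < ?", (cutoff,))
    conn.close()
```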
|
|
|
|
|
|
|
Joel Stern (00:11:11) - Thomas, could you just say something about the kind of political imperative to delete that data? I mean, what drives that desire, for you and for the company? Why is it so important?
|
|
|
|
|
|
|
Thomas Stachura (00:11:25) - I think there's a lot of companies out there making statements about trusting them with your data, and I don't want Paranoid to just be another company that says, yes, you can trust us with your data. We are creating products and a corporate structure where you don't need to trust us, because the structure implicitly does not allow us to violate your privacy. So, for example, the Paranoid product itself has no connection to the internet and has no software update. Well, it has software updates, but they're one-directional, and I can talk a little bit about that later on; it's quite innovative how it's one-directional, in a way I don't think has ever been done before. And so people can trust the device even if they don't trust the person manufacturing it, because they can verify for themselves that the device has no wifi, no Bluetooth, no capability to connect to the internet; nor do they program it to their wifi. Nothing.
|
|
|
|
|
|
|
Thomas Stachura (00:12:33) - And so they don't have to trust us. Same with us as a corporation: it's a little harder, but we want to create ways that they can trust us. For example, we don't have cookies, right? There's no JavaScript related to Google Analytics, or any analytics external to our company. So now they don't have to trust that we're policing Google Analytics with their privacy; Google will not know that they're on our website. Unless you're using Chrome, and that's a different story, but that's a browser issue. So ultimately, we want to keep eroding our own ability, our own choice, to even violate our customers' data, so that at the end of the day customers can say: you know what, I don't trust Thomas, I don't trust Paranoid, but there's nothing they can do to hurt me, because of A, B, C, and D. And that, I think, makes the company very different from other companies. And I think it would be very hard to replicate what we're doing at a trust level; other companies can't go and do that. Google could not become allergic to data. Amazon, Apple, Microsoft: none of them could become allergic to data like we are.
|
|
|
|
|
|
|
Sean Dockray (00:13:49) - Well, I was just going to say: although I think that's definitely true, that these companies whose whole business model is built on the accumulation of data couldn't do what you do, there's that kind of gray area in the middle, of so many companies that are ostensibly doing something unrelated to data. Their core purpose is not about the accumulation of data and generating advertising, but they still kind of fall into the trap, you know, kind of what you were saying about the people at Pleasant Solutions just sort of saying: you've got to be collecting data, because you need information and data in order to make decisions, right? So I think that, although it's innovative, it also can serve as a model for companies more generally. Like, of course Apple and Google can't do it, but I'm just thinking of all the other smaller companies that potentially could, um, could learn something and maybe adopt some of the practices. So I think that's very interesting.
|
|
|
|
|
|
|
Sean Dockray (00:14:59) - And it's also just opening my eyes to how Paranoid is not just, you know, that device that sits on a smart speaker, right? You're also designing the company itself. And so I guess I'm wondering what plans you have in store for further development: what future products, and, I don't know, even what future organizational innovations of the kind you're talking about.
|
|
|
|
|
|
|
Thomas Stachura (00:15:27) - So there's a few things to respond to in there. In terms of other companies, where data is not their business and yet they end up collecting data: I would ask anybody, how many companies end up doing major pivots over the course of 10, 20 years? And when you think about that, how many companies are currently deliberately deleting data because it's too big for them to store? That's not really happening. And so, with the data they're collecting now, even if they're completely ignoring it, 10, 20 years from now a huge portion of those companies are going to be pivoting. And is it common for companies to pivot towards data analytics and more advanced machine learning and all that? Absolutely. It's a global trend for companies to pivot from something more basic towards data analytics or machine learning. And so I look at these overall trends: collect the data whether you need it or not; never delete the data; companies generally pivoting over the course of time.
|
|
|
|
|
|
|
Thomas Stachura (00:16:34) - And with a lot of companies pivoting towards machine learning and accessing data in whatever way, all of those come together to say: you know what, I don't trust a company that says, we're collecting the data, but trust me, we're not using it. Not to mention, of course, the poor track record of companies: even when legally they're not allowed to, or when they promise in their contracts that they won't, they don't have a good track record of abiding by that. In terms of where the company's heading, I think it's important to look at our mandate: earn lots of money by increasing privacy. That was very controversial. It speaks to what we're going to be like as an organization: radical transparency. It's not just about data allergy; it's about radical transparency. And so, we actually had a COO in our company, and again, talk about culture shock.
|
|
|
|
|
|
|
Thomas Stachura (00:17:32) - This is why other companies cannot do it very easily. When I put that mandate up, I said, well, look, I have not been a privacy advocate for the last 40 years, and I'm not going to pretend to be. I'm here to earn money by increasing privacy; that's my market opportunity. They actually took it down a couple of days later, without my knowledge. They did a little coup, and they took it down out of concern for our company's wellbeing. And I actually had to have a stern discussion with them to put it back up: who authorized this? How can you do this? That was the company mandate; don't water it down. And so it's not just about ways to not collect the data; it's about exploring ways of being, not for the sake of friction, but for the sake of radical transparency, relaying to the public what we're about. You know, are we going to scare away the public by saying we're here to make lots of money?
|
|
|
|
|
|
|
Thomas Stachura (00:18:31) - Because, you know what, if Google said that, people would just get more alarmed, because they already knew it; but now, if they're saying it, it must be ten times more true: something's going to go down, they're going to abuse everybody's privacy, something. So with us coming out of the gates saying it, hopefully people are going to say: well, you know what, that's at least one statement I can believe. Okay, I believe that you're here to make lots of money; I'm going to keep reading about this company; maybe you're telling the truth about the rest of it. So it's about not only the data allergy, but radical transparency. And what does that involve? I don't know. Pleasant Solutions is not a radically transparent company in that way, so I don't even have a model behind Paranoid to follow, of a company that is that radically transparent. Maybe charities, I don't know.
|
|
|
|
|
|
|
James Parker (00:19:23) - It sounds like the business model you're describing is effectively a wager on the possibility of a privacy industry, as a sort of counterpoint to, you know, whatever you might call it, surveillance capitalism or platform capitalism or something: a new market in which privacy is the selling point. I mean, does that industry exist, that you know of? Are you trying to invent it? Do you have allies, you know, in that market space? When I was reading about the company, I just thought: it sounds like they're trying to produce a new business model. It's not exactly that it's parasitic on mass data collection, or antagonistic to it, but either way it's kind of in relation to it.
|
|
|
|
|
|
|
Thomas Stachura (00:20:23) - The market absolutely existed. There were lots of people worried about surveillance, but they were all crazy, right? I mean, up until recently, everybody was just crazy for the last 20 years. No; I mean, that's the perception. The market existed, but it was ultra niche, ultra fringe, not mainstream-accepted, and something where, you know, you should just learn to accept trusting your government. And as a side note, I trust my government far more than I trust corporations; to each his own, I guess. But now I feel like the market is becoming mainstream. And the reason behind that is that I think there's a lot more to fear from corporations than from government. Government, at least on its surface, claims to be for the people; on the surface, corporations do not even claim that. In fact, it's illegal in many cases to favor the people too much rather than the shareholders and making money; in the US, you are legally obligated to focus on making money.
|
|
|
|
|
|
|
Thomas Stachura (00:21:32) - So there are cases, I'm sure, where if you donate too much to charity and the shareholders don't like it, they can say you're not working in their best interest, and they can sue you for it. So there's a very different premise there, and I think it creates a structure that is much higher risk. And up until now, I think corporations didn't have quite as much power, whereas we're entering a time where, you know, I always wondered whether and when global government would happen, and now I'm thinking it's happening. It's just not going to be the politicians and geographic countries; it's going to be global government because corporations are going to, piece by piece, take over infrastructure. If you think about how countries used to run the post offices: I mean, I don't know about every country, but certainly in all the ones I'm familiar with, it was run by government, for the people, and was not intended to be a profit center.
|
|
|
|
|
|
|
Thomas Stachura (00:22:27) - And if it did make a profit, they would divert it to other departments. Now our most vital form of communication (and I will say that email is more vital to our communication than the postal system) is run by corporations. Our most vital communication tool used to be the telephone, which was heavily regulated, though yes, run by corporations; the internet is largely the domain of completely deregulated corporations. And so corporations control our access to information. And you start thinking about, like, well, Google controls our search; Google could sway what kind of results I get, if they wanted to, and sway my voting, and all that kind of stuff. At the end of the day, I feel we're actually seeing the start of a global government run by corporations. And, wow, I just sound like I'm crazy, like those people 20 years ago. But that's an opinion; I don't dwell on it every day. Nonetheless, I think that's what's creating a bit of a market. I think the market is that people are now starting to become concerned about corporations, and I think it's more valid to be concerned about corporations than about government-run infrastructure. And because of that, what was a niche market is, I think, becoming a multi-faceted market that's going to grow. And so, is Paranoid targeting that? Absolutely. Paranoid is positioning itself to create many tools to protect the privacy of customers.
|
|
|
|
|
|
|
James Parker (00:24:05) - Okay. Could you maybe say a little bit about the product range that you have at the moment, and, you know, how the products are doing, and so on?
|
|
|
|
|
|
|
Thomas Stachura (00:24:10) - Currently, for the products, I'll talk a little bit about our smart speaker product. It's an add-on device that you put on top of your device, or that's built into the device; you purchase it as a separate add-on after you've bought Google or Amazon or whatever, and it blocks the speaker from eavesdropping until you say the word "paranoid." So of course this sounds loopy to some. Like, okay, you say, "Paranoid, hey Google"; well then, who's going to protect me from Paranoid listening? Well, like I said, Paranoid itself cannot transmit the data. So it's okay that it's listening, because it has no connection to anything. It's like a cassette deck in my basement: it might as well listen, since it's not transferring anything anywhere.
|
|
|
|
|
|
|
Thomas Stachura (00:24:56) - It's protected, and it blocks the microphone. There's a few ways to do that, and we have a few models. The first one is literally pressing the mute button; that is the first one that we were able to launch, by pressing the mute button. In theory, the smart speaker is going to be in mute mode every time you're having a regular conversation without first talking to the smart speaker and saying "paranoid." In that model, as well as all of them, we've gone to the extent where it has a little dongle, a little antenna, that watches for the lights, watches for the reaction of the smart speaker. If it does not show that it's listening, by turning on its lights and activating and interacting with you, the microphone will be turned back off. So if we have a false alarm, we've gone to the extent of shutting down the microphone two seconds later, because the smart speaker didn't light up like it should.
|
|
|
|
|
|
|
Thomas Stachura (00:25:56) - The second model is by way of jamming. We're going to be jamming the microphone, depending on the exact device, in different ways, in a way that's silent; but nonetheless the microphone is drowned out with noise. And we've gone to great lengths to try to make sure that it's future-proof against machine learning listening really closely. It's not enough that saying "Hey Google" does nothing; it's not enough to prevent it from talking back to you. It has to be something where, if the audio was stored, ten years later nobody's going to get at it, even if they put a lot of work into it. So we're really trying to jam the audio. The third one, Paranoid Max, is, well, of course, the maximum: we cut the microphone wires. Now, technically they're not wires, they're circuits, but never mind that; we're effectively cutting the wires to the microphone, and until you say the word "paranoid," we actually won't let electricity flow from the microphone of the smart speaker to the CPU of the smart speaker. So those are the three models that we've got currently.
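As a rough illustration of the control flow described for the first model, here is a minimal sketch. It is not Paranoid's firmware; the three callbacks are hypothetical hardware hooks invented for the example.

```python
# Toy control loop: keep the speaker muted by default, unmute when the
# add-on hears "paranoid", and re-mute if the speaker's own activity lights
# don't come on within two seconds (the false-alarm watchdog).
import time

WATCHDOG_SECONDS = 2.0

def control_loop(heard_wake_word, speaker_lights_on, set_mic_enabled):
    # heard_wake_word() -> True when the add-on detects "paranoid"
    # speaker_lights_on() -> True when the speaker's lights show activation
    # set_mic_enabled(bool) -> presses mute / restores or cuts the mic circuit
    set_mic_enabled(False)  # default state: microphone blocked
    while True:
        if heard_wake_word():
            set_mic_enabled(True)  # let the real command through
            deadline = time.monotonic() + WATCHDOG_SECONDS
            activated = False
            while time.monotonic() < deadline:
                if speaker_lights_on():
                    activated = True  # the speaker reacted; genuine wake
                    break
                time.sleep(0.05)
            if not activated:
                set_mic_enabled(False)  # false alarm: shut the mic again
            # (re-muting after a genuine interaction ends is omitted here)
        time.sleep(0.05)
```

The useful property is that a false unmute fails closed: if the speaker never acknowledges the wake word, the microphone is cut again within the timeout.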
|
|
|
|
|
|
|
Joel Stern (00:27:04) - I'm interested in the decision to focus on microphones, and on the audio capacities of smart speakers. You mentioned that Eureka moment a few years ago, but I'm wondering: is there something specific about Machine Listening, and the kind of listening practices of smart speakers, that made you want to focus the product range so specifically on that problem? And whether you could say something (you've spoken a lot about privacy) about your fears and anxieties around how the audio which is captured by these speakers might in fact be used, in ways that we'd want to protect ourselves from. What are your fears or anxieties around the capture of audio, and the way in which these devices listen to us?
|
|
|
|
|
|
|
Joel Stern (00:28:04) - You've mentioned privacy in general as a principle which is very important, but is there something specific about the capture of audio, and the listening capacities of smart speakers, that made you want to focus on them in particular, in developing products to jam and, um, circumnavigate the microphone? You know, because of course our project is called Machine Listening, and, you know, we're very concerned with surveillance and privacy discourse in general, but we're also trying to think specifically about sound and listening, and about whether there are particular problems that we can diagnose and think about in relation to those things.
|
|
|
|
|
|
|
Thomas Stachura (00:28:56) - I think what worries me about corporate data in general (and I'll get to audio data specifically in a moment) is that, over the years, I feel it's been demonstrated to me that people are creative, and companies are creative too. And if you give them a resource, they will find many creative ways, ways I have not thought of, to use that data. If I didn't think companies were extremely creative, then I would think of all the possible ways that you can use data, and I would decide whether those are appropriate risks for me, and I would probably move on with my life and not worry about privacy. However, I know there are phenomenally smart and creative people in the world, and so whatever list I can come up with, I feel it's not even 1% of what's possible. So, some of the usual ones that many people think of: okay, you're going to use it for insurance, because you know about my health or my behaviors.
|
|
|
|
|
|
|
Thomas Stachura (00:29:57) - You're going to use it to hire people; that's a more recent one, which maybe wasn't obvious 10 years ago. People will go and say, well, what did they say on social media 10 years ago? Now there's politics, because of the scandals. Maybe it's persuading people to impulse buy; that's an old one. But where else can it go? If I knew that answer, or if I thought I had the monopoly on coming up with new ideas for using this data, then I'd be comfortable. The thing that scares me is: I don't know. I don't know how else they're going to use the data, but I know that it's going to be far beyond what I can imagine. And even for the list that I just mentioned, I would say, maybe I'm okay with the impulse buying right now, but I'm also worried about how advanced it gets with more data.
|
|
|
|
|
|
|
Thomas Stachura (00:30:49) - We all know that with machine learning, the more data you get, the better it gets (and maybe there are new techniques that are not that way). So I'm worried about my kids having a dossier on them for 20 years, and then just how deeply the machine learning will be able to manipulate them over time. I feel like I have formed a lot of my opinions; at this stage, I'm not going to be as manipulated, though I think everything influences me. But my children, I feel, will be at a disadvantage, because they might still be at a developmental stage 10 years from now. They will have a lot more data on them than I had when I was a child, so there's going to be more of a dossier, and the machine learning is going to be much more powerful.
|
|
|
|
|
|
|
Thomas Stachura (00:31:39) - All of those things make me more scared for my children than I am for myself. Yeah, so those are the two things. Now, audio data particularly scares me because I'm not exactly sure; even more so, I know even less about how the audio is going to be used. It seems like a smaller usage case. Perhaps they'll start to find my emotional patterns. Perhaps they'll figure out what time of day is best to negotiate against me, because of the tone of my voice throughout the day. Perhaps they'll know my hormonal patterns throughout the day. I don't know; I'm very limited in those examples. But I feel like it's very intimate knowledge. It's not stuff that I'm consciously giving in a few words of text; I am giving megabytes of information every time I speak. And there's so much data there, I don't even know what data I'm giving.
|
|
|
|
|
|
|
Thomas Stachura (00:32:39) - So there's two elements to that. One: I don't understand how it's going to be used, because it's harder to predict. Two: it's not filtered by me; it's subconscious data that's going out. It almost feels like, if I knew the extent of how they could use my audio (voice or video), it might be just as bad as starting to monitor my brainwaves. It feels just as pervasive to me as monitoring brainwaves, because I think they'll be able to get at the same data about what's going on in my brain, just based on the tone.
|
|
|
|
|
|
|
Joel Stern (00:33:17) - The example of being worried for the development of children as they're exposed to these technologies is a really important one for thinking about audio too, because I suppose very young children are going to be interacting, through speech, with these devices, probably before they're, for instance, on social media or using the computer itself. So perhaps audio data is a way to engage and capture the habits and behaviors of children at an even earlier stage in their development. Can I just follow up on some of the really interesting things you said? The first one I'm thinking of is: so Paranoid's product range is sort of targeted at the space around the wake word. The problem is presumed to be that Google, or whoever, is listening even when you don't want them to be.
|
|
|
|
|
|
|
James Parker (00:34:19) - And I think, you know, on the website you have a statistic that flashes up about how many times these devices tend to accidentally switch on. I was reading a piece recently about how "OK Google" gets triggered by "cocaine noodle"; anyway, it was a whole thing about how they get mis-triggered. And of course they're listening, and we don't know, because we don't know that we can trust them. It's not even just mis-triggering; it's, you know, when they say they're not listening, are they really listening? But there's an expanding amount of things, precisely as you pointed out, that can be done with the data once the device is on.
|
|
|
|
|
|
|
James Parker (00:35:06) - So, for example, there's a number of companies at the moment working on, you know, COVID voice detection. And Amazon already had a patent a number of years ago whereby it would work out when you were coughing and start selling you cough medicine, and so on and so on. So there's all sorts of things that these smart speakers hear, and it's expanding, when they're on. So that's the first question: does Paranoid have any intentions, or ability, to deal with that space where, once you've allowed the device to listen to you, what it's listening to and the specific ways in which it's listening are growing and expanding? Just because I consent for it to be on doesn't necessarily mean I consent to everything that's being apprehended when it's on.
|
|
|
|
|
|
|
James Parker (00:35:58) - So that's the first question. And then the second question is: what about beyond smart speakers? Amazon just launched the Halo today; I don't know if you've seen all of the press around this. So this is a device that's like a Fitbit, or what have you, sort of a health tracker. It wants you to do full-body scans of yourself using your phone, so it's for fat and so on. But it also has a microphone embedded which doesn't require wake words. They say that it's not listening all the time, it only listens intermittently, but it's specifically listening for emotion. I have no idea what people think would be interesting or useful to themselves there. You know, I can understand why I might want to know how much I'm walking a day, but I'm not totally sure why anyone would want to get feedback on how aggressive they sounded today, or how sad, or what have you. But anyway, that's already a thing. And so I'm immediately thinking, well, where's the Paranoid for a Halo? And so, yeah, I'm just wondering, on both fronts: does Paranoid have any intentions to go in either of those directions, and what do you think about that? Or are they just totally different technical problems that can't be addressed, and so on?
|
|
|
|
|
|
|
Thomas Stachura (00:37:23) - So, in addition to the disadvantages we have in building a company where we're allergic to data, and where we are going for radical transparency (which can scare some people, and people will use statements against us, like, look, they said they're earning lots of money), we have a third disadvantage, in that our premise is to unite tech and privacy. And what that means is that we are not just defending privacy by sacrificing the features and the technology. We are trying to innovate in a space where you get the full usefulness of the product; where, to the maximum extent possible, you get the full usefulness of the product. It is much easier to do a privacy product where, for example, you can just go up and press the mute button, or whatever, some fast way to turn it on and off; but that eliminates the voice activation feature.
|
|
|
|
|
|
|
Thomas Stachura (00:38:16) - So now, all of a sudden, you can't do it by being lazy on the couch. And I want to be lazy on my couch; I get that. So I appreciate the tech portion. When you ask why anybody would want to know their emotional map from a Halo: I can see that I would want to know the emotional map for myself, and say, okay, well, it looks like I'm most cheery at this time of day, so that's probably the best time of day for me to do X task. Or maybe that's the time I should be drinking more caffeine, because I get grumpy, or whatever the case is. So I can understand and value that, because I appreciate the tech. I mean, we have a whole company, Pleasant Solutions, that's devoted just to the tech side, not the privacy side. So that gives us three things that we have to worry about.
|
|
|
|
|
|
|
Thomas Stachura (00:39:03) - And then we get confronted with: absolutely, I understand, on the privacy side. Do they need to know your emotions when you just want to say, hey Google, what's the weather? Do they need to know whether I'm feeling grumpy? Do they need to hear my kids in the background during that moment? Do they have to know when my fridge is running? I don't know how that hurts me, but maybe they'll know what temperature it is in my house based on how often my fridge turns on; I don't know, that's a creative one. Who would have thought? The answer is: we do have that, actually. We have patents already in place, and it's on the product roadmap. We can't do it with the mute button, obviously, but we can do it with the Paranoid Max. And it is a potential future avenue we'll go down, where we will obscure people's voices: we will drown the voice in just enough noise that you can still hear it, or we will replay it with a robotic voice, or whatever the case is, so that all the device is getting is just the words.
|
|
|
|
|
|
|
Thomas Stachura (00:40:02) - "Check the weather," or, you know, "add something to my calendar," and they will get nothing else: no background, no guests talking in the back, none of that. And as a side note, aside from kids, we're also really trying to protect guests, you know, things that are happening in the background. There are already cases where smart speakers are recording the neighbors' conversations. So if you're in an apartment building, you don't even know if you're being recorded or not. What if there's a smart speaker literally right through a thin wall, right there? Every time they're activating it, something's leaking through, and you might be saying something very important. You know, as a CEO, I compete with a number of companies; I care about my privacy from a corporate standpoint, corporate espionage. So that's another concern that I'm always thinking about. So in general, I would say: yes, it's on our product roadmap, and we have a lot of things to worry about in terms of how not to remove features. If they start adding a feature related to emotions, it's going to be very difficult for us to remove the emotions without disturbing that feature. So at that point, we might have to give the user a choice: do you want your emotions to go through? If not, it might turn off some features.
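Not the patented technique, which isn't spelled out here; just a toy illustration of the calibrated-noise idea: mix in noise loud enough to bury what's in the background, but quiet enough that the foreground command stays intelligible. The function and the default signal-to-noise ratio are invented for the example.

```python
# Toy background masking: add white noise at a chosen SNR so quiet background
# sounds drop below the noise floor while the (louder) command remains audible.
import numpy as np

def mask_background(audio: np.ndarray, snr_db: float = 3.0) -> np.ndarray:
    """audio: mono float samples in [-1, 1]; a lower snr_db buries more."""
    rms = np.sqrt(np.mean(audio ** 2)) + 1e-12   # overall signal level
    noise_rms = rms / (10.0 ** (snr_db / 20.0))  # noise level for target SNR
    noise = np.random.normal(0.0, noise_rms, size=audio.shape)
    return np.clip(audio + noise, -1.0, 1.0).astype(np.float32)
```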
|
|
|
|
|
|
|
James Parker (00:41:19) - And what about, um, the horizon beyond smart speakers? Like, you know, a micro Paranoid that goes on your wristband, or, um, on the back of your phone or something.
|
|
|
|
|
|
|
Thomas Stachura (00:41:31) - I won't rule anything out, but I will say it's already proven to be quite a challenge to do the smart speakers, and there's a lot of fine engineering that has to go into that. I think there's opportunities to do other things, and we're definitely exploring other markets. I can say, for example, that for some of our models we're exploring whether they can go onto smart TVs. I wish I could say more, but I will say that there are some software cloud tools, and there's a whole bunch of markets for privacy, and we're going to be pursuing a number of them.
|
|
|
|
|
|
|
Sean Dockray (00:42:07) - Just, uh, in light of the comments about this idea of a new global government, and the diminished role of actual government, and what you had said earlier about Paranoid almost operating a bit like a charity: it is a company, and the mission is to make money, and I understand that, but at the same time the way it behaves is a little bit like a charity. I was just thinking about the way that it suggests, maybe, an idea of how regulation can happen in this kind of new, corporation-based global government. And that regulation, hopefully we can still pursue it through the state at some level, but this is almost saying we need to pursue regulation through technological means, and that Paranoid's product range, at some level, is this kind of performance of technological regulation. But that's more of a reflection; I dunno if that resonates with you.
|
|
|
|
|
|
|
Thomas Stachura (00:43:19) - So I would say that, when it comes to regulating, I don't believe that government regulation is going to really stop the corporations, because they will find a way around it, and the governments aren't, I would say, motivated enough to be proactive that way. And again, I sound a little crazy when I start thinking like this; I don't normally talk about global government being corporations. But I think the solution is going to end up being other corporations that have a mandate to hunt down privacy violations in other companies. And, I don't know, if the government were to offer a bounty, that would definitely motivate a lot of companies. If the government said: if you can find privacy violations, then for each person whose privacy was violated (a minimum of a thousand people, or whatever) we will give you this much money; now, all of a sudden, we've privatized the idea of enforcing privacy.
|
|
|
|
|
|
|
Thomas Stachura (00:44:19) - And I think that has a better chance of fighting against the privatized interest in keeping the data. Other than that, I can give you two stories I saw on my list; maybe they fit in, maybe they don't. The update system that we had was quite an interesting and challenging problem, because from a product development standpoint, we knew we needed to iterate: if we release software out there, we need to be able to do a software patch. But what Paranoid customer is going to want to install a software app, or plug the Paranoid device into their phone? Now, all of a sudden, we've lost the claim that nothing ever leaves the device. And so we started thinking at first about an app that can play sound, like a modem,
|
|
|
|
|
|
|
Thomas Stachura (00:45:11) - to the device. But then we started thinking: well, how do we know that the app isn't receiving information too? You know, whether it accesses the microphone or not, and so on. So we actually landed on something I think is innovative, and we're not a hundred percent sure whether it will work as well as we want, but: you update the device by playing a song on any MP3 player. So, with the MP3 player, of course, you can choose whichever one you trust; we just release an MP3 file. Obviously it's not a program, so you can't program it to record information and send it home. And you play the song, and it updates the device. Much like Google has Lollipop and various candy names for their operating systems, we're going to have versions that are associated with music, which has its own challenges. I wonder what people are going to think about the song while they're updating; right now, as it stands, they might have to hear it a few times, or leave the room if they don't like the song. But we found a way to do a one-way update into the system, with nothing coming out. You could update it with a cassette deck, in theory.
|
|
|
|
|
|
|
James Parker (00:46:22) - That's fascinating. Can I just ask, then: the information that's being received by the device, it's not audible, is it? You know, we've been reading about this idea of internet over audio, or data over audio, and adversarial audio, and the different ways in which signals can be sent over audio without being apprehensible to humans. Is that how it works? So you could, in principle, encode it in Beyoncé or whatever you wanted? Or is there something specific about the song?
|
|
|
|
|
|
|
Thomas Stachura (00:47:07) - The system will allow for any song, at least in theory. And, true to our premise of transparency (and also for the health of pets), we're trying not to use ultrasonic or anything like that. Instead, we're trying to do it where, let's just put it this way: the song will not sound like an audiophile putting on his headphones and listening to an orchestra. It might sound like there's some interference, a little bit; you'll be able to hear some of the data. But you're not going to hear the ones and zeros, of course.
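To make the idea concrete, here is a toy sketch of an audible data-over-audio channel of the general kind described: update bytes modulated as two audible tones and mixed faintly under a song. The frequencies, bit rate, and mixing level are invented for illustration; Paranoid's actual encoding isn't public here.

```python
# Toy FSK-over-music encoder: each bit becomes a short sine burst at one of
# two audible frequencies, mixed quietly under the song, so the result sounds
# like music with faint interference rather than an ultrasonic side channel.
import numpy as np

RATE = 44_100              # samples per second
BIT_SECONDS = 0.01         # 100 bits per second (arbitrary choice)
F0, F1 = 1_200.0, 2_200.0  # tone for a 0 bit / tone for a 1 bit

def fsk_modulate(payload: bytes) -> np.ndarray:
    bits = [(byte >> i) & 1 for byte in payload for i in range(8)]
    n = int(RATE * BIT_SECONDS)
    t = np.arange(n) / RATE
    bursts = [np.sin(2 * np.pi * (F1 if b else F0) * t) for b in bits]
    return np.concatenate(bursts).astype(np.float32)

def embed_in_song(song: np.ndarray, payload: bytes, level: float = 0.05) -> np.ndarray:
    # song: mono float samples in [-1, 1]; the data track is looped or
    # trimmed to the song's length and mixed in at a low level.
    data = np.resize(fsk_modulate(payload), song.shape)
    return np.clip(song + level * data, -1.0, 1.0)
```

Because the channel is one-directional by construction (the device only ever hears sound and never emits any), a listener can verify the privacy claim without having to trust the firmware.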
|
|
|
|
|
|
|
James Parker (00:47:38) - That's absolutely amazing. And what was the other example you were going to give? |
|
|
|
|
|
|
|
Thomas Stachura (00:47:40) - What I found very revealing is when we did our first commercial, for paranoid.com. I don't know if you've seen the commercial on the video site. We built a video studio just for that, with alien sci-fi effects and everything going on; like, we went as far as we could. And we got enough attention that people started making mocking videos about us. So our video is two minutes, and I think very entertaining, but I was entertained watching a ten-minute rant on YouTube about how bad our commercial was, and the title was "the stupidest product ever." And not just because, at this stage, any publicity is good publicity (which is not always true, though there's an element of that in this case); it gave me a really good realization. The reason why he was saying it was the stupidest product ever was not because he was saying smart speakers can be trusted,
|
|
|
|
|
|
|
Thomas Stachura (00:48:42) - that it's not a concern, that, you know, regulation will take care of it, that government will protect us. No, he was saying it was the dumbest product ever because you shouldn't even own a smart speaker. Just don't do it, no matter what; don't have one. And yet, at the same time, he was competing against billions of dollars of marketing that are going to make smart speakers more useful. So we have this whole camp of people saying smart speakers are not that useful, don't get them. But you know what? With billions of dollars and a lot of creativity, I bet you that at some point you're not going to be able to buy a house without smart speakers installed, because they're going to be useful, and builders will find them useful, and eventually there will be no other way to open and close your garage. There is no button, because it works better just voice-activated, or whatever the case may be. And so we'll end up with a situation where, on one side, billions of dollars are going into telling you: don't worry about your privacy, you're in good hands with government and/or corporations. And then you'll have extremists saying: don't trust technology, absolutely not, under no circumstances; it is stupid to trust technology. And they call the middle ground stupid. They call the middle ground stupid because it's not worth it to budge an inch. And where I had a great realization is:
|
|
|
|
|
|
|
Thomas Stachura (00:50:06) - as an inventor, I often look for gaps in the middle. There's a huge gap, and I think the majority of people are in the middle, but there's a huge gap in the subject matter being discussed. There are people who are very much advocating privacy, and there are very much people saying, who cares. And we saw that on our Facebook advertising and all that kind of stuff. We saw that when we did the video ad: half the comments were very aggressively pro, and half the comments were very aggressively against, and there wasn't a lot of recognition that most people, I think, are in the middle. Everybody acted like there is no middle. And so that's where the market is for us. And as a side note, we openly disclose this in our cookie policy: where other companies, like Facebook or whatever, collect the data, then once it's collected, once it's there, we will absolutely make use of it.
|
|
|
|
|
|
|
Thomas Stachura (00:51:06) - So when we advertise on Facebook (because we will advertise Paranoid on Facebook), we will actually look at their analytics and look at their stats. We just refuse to collect that ourselves, because once it's there, they've already got it; you're toast; somebody's going to be using it on you. So, I would say that our initial launch was quieter than we liked, and then, all of a sudden, out of nowhere, at the start of April, we started getting a manyfold increase in our sales. And I don't think it was the mocking commercial, which came out in April (he thought it was an April Fools' joke, so I know he found out about it on April Fools'). Maybe everybody else needed to see that it had existed for a while, like when they research it: oh, this came out in February. But for whatever reason, sales went quite a bit higher in April, and we've been enjoying it since. And see, there's an example: if I had my analytics, if this were Pleasant Solutions, I could tell you exactly where all the people came from. No, we're going to have to find out the old-fashioned way: where did people find out about us? We have to ask them on the store order form: how did you find out about us? So the sales have definitely been increasing, enough that I know now that we're committing seven engineers to it, and we're committing to additional product lines, and we're expanding it. Like, we're going full throttle.
|
|
|
|
|
|
|
James Parker (00:52:32) - I mean, it sounds like, for you, it's a good thing if, you know, the Amazon Echo keeps growing. I mean, it's a bit hard to find the statistics; we've been told that smart speakers are the fastest-growing consumer product since the smartphone. I'm not totally sure whether that's true, but from your perspective, because you want to unite privacy and tech, that's a good thing, right?
|
|
|
|
|
|
|
Thomas Stachura (00:52:57) - Well, in terms of being the fastest growing, I would say I'm not surprised if it's more than phones or anything else, because never before has so much been given away for free. Of the over one hundred million smart speakers out there, a huge portion of them are free. And so, of course, adoption: like, I've never seen so much of a rush to give a customer something for free, because they know the value of the data. They know this is the next Google search; you're going to be searching and talking to your smart speaker more than you type on your desktop computer. Why get up when you can just ask? So I'm not surprised if it's the fastest growing, but you have to factor in how many people are getting a gift for free from the company. Not to mention, a lot of these are being re-gifted and whatever.
|
|
|
|
|
|
|
Thomas Stachura (00:53:47) - So many of those are not actually being turned on. I know so many people who have gotten a smart speaker, and they say, well, it's turned off in the basement. To which I respond: you should tell whoever gave it to you to also give you a Paranoid, for free; then you can turn it on and feel secure about it. So, is it advantageous when everybody increases their use of technology? Absolutely. Our market depends on companies not being trusted, which I think I'm safe to say is going to be a good market for a number of years. And it depends on technology collecting data, which I think I can also safely say is a market that's going to grow leaps and bounds. And I think it's almost a given, for me, that we're going to be growing leaps and bounds along with that. Do I hope that adoption comes about more rapidly with smart speakers? I don't care that much, because if it's not that, it's something else, and whatever it is, we're going to chase down the ways to make it more private. And so I think there's already, you know, a big enough market for me to chew on. I don't care if it goes from 150 to 300 million smart speakers; my market's big enough. I have to focus on refining my message, getting the sales, getting the word out there, and building trust.
|
|
|
|
|
|
|
Thomas Stachura (00:55:10) - Whether that's sending it to privacy advocates and letting them dismantle it and then, you know, speak on whether it's communicating to servers or not, or whatever the case is: that's my focus. So I'm indifferent; but of course, the market's going to be there no matter what I do.