

Take the Pain Out of IoT Projects with Microsoft

Pete Bernard, Internet of Things, IoT interoperability, IoT friction, IoT development

The pandemic accelerated nearly every IoT project. Whether it is curbside pickup, online ordering, telehealth, wellness monitoring, or work-from-home services, organizations across the globe are putting IoT applications on a fast track.

But getting these new applications deployed has never been more complicated. Supply chains are experiencing disruption. New workloads like AI are pushing the boundaries of compute power. Interoperability challenges are slowing development. And many organizations are missing key skill sets and resources.

In this podcast, we talk about the key pain points of IoT development, the importance of having a developer-friendly platform, and how to address IoT interoperability challenges. 

Our Guest: Microsoft

Our guest this episode is Pete Bernard, Senior Director of Silicon and Telecom in the Azure Edge Devices Platform and Services Group at Microsoft. Pete has been with Microsoft for about 16 years, and is currently focused on telecommunications, mobility, IoT, and how to successfully drive disruption with innovative engineering and strong partners.

Podcast Topics

Pete answers our questions about:

  • (4:20) The major trends and challenges of IoT development
  • (5:27) The rise of intensive workloads and the role of edge computing
  • (6:43) How the pandemic has changed IoT app considerations
  • (8:43) Why being IoT developer-friendly matters
  • (12:57) The pain points of IoT interoperability
  • (22:05) How Microsoft is trying to reduce IoT friction
  • (24:43) The types of skill sets and resources organizations need to have
  • (29:03) A look at medium- to long-term IoT priorities
  • (31:47) How to make the life of an IoT developer easier

Related Content

To learn more about AI and edge computing, read Remove Friction from IoT Development with Microsoft. For the latest innovations from Microsoft, follow them on Twitter at @Microsoft and LinkedIn at Microsoft.


Transcript

Kenton Williston: Welcome to the IoT Chat, where we explore the trends that matter for consultants, systems integrators, and end users. I’m Kenton Williston, the Editor-in-Chief of insight.tech. Every episode I talk to a leading expert about the latest developments in the Internet of Things.

Today I’m talking about ways to remove friction from IoT projects with Pete Bernard, Senior Director of Silicon and Telecom in the Platform and Services Group at Microsoft.

So, what makes commercial IoT applications so challenging? Well, tight schedules, growing complexity, and the need for new skills in areas like AI and security are just a few of the issues that come to my mind. But I want to know how Microsoft is looking at these issues.

So let’s get right to it. Pete, welcome to the show.

Pete Bernard: Thanks for having me.

Kenton Williston: Can you tell me a little bit more about your position?

Pete Bernard: I’m responsible for all of our silicon partnerships for the edge, as well as our, kind of, telco work around 5G edge and AI. I’m part of the Azure Edge Devices, Platforms, and Services Group, which is a real mouthful. Sometimes we call it AED-Plus or AED-Please, if you want to be polite about it—which is all part of the wonderful world of Azure.

Kenton Williston: Perfect. And what brought you into this position—what did you do before this?

Pete Bernard: Well, I’ve been at Microsoft about 16 years. Came from Silicon Valley, came from sort of software-meets-hardware background. I was originally a BIOS engineer at Phoenix Technologies and did that for a while. And then did a little startup stuff, and did some mobile Java stuff, and eventually got to Microsoft. And so I’ve had a number of roles through the years here, mostly in that, kind of, more edge space, and especially where edge means telco, but also where semiconductor partners are involved.

Kenton Williston: Very cool. I didn’t know that—the BIOS stuff has always been really interesting to me. And of course it’s one of the things that’s very different when you’re talking about embedded systems for IoT versus PCs; it’s one of the areas that starts to be really important.

Pete Bernard: It is. It’s one of those things you ask yourself: what happens once the power turns on, and what exactly happens on a board, how do all the chips get initialized, and how do they get powered—and all that kind of messy stuff under the covers that we sort of take for granted when we use these devices, right? These days, a lot of software development—very high-level software development—is very visual, but when you get down there onto the board and to the metal, that’s pretty tactical and practical, but it’s fun.

Kenton Williston: That’s a really awesome segue into what I wanted to talk to you about today. So, I’ve been working in the embedded/IoT space for—I’d rather not say that it was 20 years, but it is. And back in the day—and even until quite recently—these things, like just the frustrations of trying to get a board working and trying to get your operating system loaded—I mean, just getting those basic things done has just been a huge headache. And, as time has progressed, of course these systems become more and more complicated, more and more interconnected. So the level of effort that’s required to, kind of, from the ground up build something has just got to be huge.

Pete Bernard: Right.

Kenton Williston: And I think there’s a real pressing need today to really simplify things. So you can get to the value, and not just be spending all your time on the basic bring-up-and-try-and-get-everything-just-working.

Pete Bernard: It’s a very solutions-oriented market out there, I would say, especially in what I would consider the edge ecosystem. If you’re in other ecosystems maybe it’s a little more homogeneous, a little more straightforward. But out here on the edge, if you talk to commercial customers, like we do all day, they have some pretty complicated problems, and those problems are not solved with a single device or a single app.

And so quite often it’s about how do these systems work together, how do they work together securely, how can they be managed, how can they be deployed. You’ve got an ROI envelope, right? That all this stuff needs to get into. There are CapEx costs, your OpEx costs; what workloads run on the cloud versus the edge or the near edge. So the solutions right now are quite complicated. And, yeah, I mean, you can’t afford to spend that much time sort of bringing up your board. You’ve got other fish to fry.

Kenton Williston: Yeah, absolutely. And so, to that point, you mentioned some of the key areas of concern you’re seeing, such as, for example, security, which is a huge one, as these things get all interconnected. So, what do you see as some of the major trends/challenges that folks are looking to tackle these days?

Pete Bernard: Like I said, it’s a heterogeneous space. So, quite often you’ve got lots of different things connected. And how do you have a kind of a developer fabric over that solution so that you can develop and deploy solutions? So, for example, AI models that are maybe trained on the cloud and Azure, and then get deployed to the edge—how are they optimized for the silicon on the edge? And I know we’ve been working a lot with Intel® on OpenVINO™ and working well with Azure, and a platform we call Azure Percept, which we launched in March. But that’s just one example of where the silicon characteristics and the capabilities of the silicon—which are really becoming pretty amazing—you really need to be able to take advantage of those to get the performance and the power to solve those problems. So, that’s just, kind of, one area that we’ve seen.
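
To make "optimized for the silicon on the edge" a little more concrete, here is a minimal sketch using the OpenVINO Python runtime to load an exported model and compile it for a target device. The model path and the device string are placeholders, and this is only an illustration of the general pattern, not anything specific to Azure Percept.

```python
# Minimal sketch: take a model that was trained in the cloud and exported to
# OpenVINO's IR format, then compile it for whatever edge silicon is present.
# "model.xml" and the device string are placeholders.
import openvino.runtime as ov

core = ov.Core()
print("Available devices:", core.available_devices)

model = core.read_model("model.xml")          # IR model exported after training
compiled = core.compile_model(model, "CPU")   # device name depends on the hardware
request = compiled.create_infer_request()     # ready to run inference at the edge
```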

Kenton Williston: Perfect. And you give me a perfect opportunity—speaking of things that are perfect—to note that this podcast is an Intel production, so take everything we say from that perspective.

I do absolutely agree, though, with that said, that I think some of the things around AI and machine learning and computer vision and all that cluster of technologies have been incredibly important. And I think that’s probably one of the single biggest things that has changed, I would say, over the last two or three years—is the rise of these really intensive workloads that really need to be done at the edge because you’re talking about just such massive amounts of data. Like, if you’ve got a vision system, you can’t just pipe that stuff back up to the cloud; it’s just not practical. So you need a lot of compute at the edge.

Pete Bernard: It’s interesting. We’ve seen sort of the evolution of, kind of, standalone systems, or disconnected systems. And systems connected to the cloud that can just send data up to, kind of, simple sensors. And now we have the cloud talking to the edge, and the edge talking to the cloud. And now we have the edge itself becoming more—I don’t know what the right term is—more, I don’t want to say self-aware, that sounds pretty scary. But basically you can run a lot of compute on the edge and sort of asynchronously from the cloud, and you have to really figure out how those things can work together. But you’re right, there’s a lot of new scenarios out there where you need some pretty high-performance edge computing, and maybe that’s for privacy reasons, maybe that’s for bandwidth and kind of egress reasons, like you mentioned. But it’s pretty exciting to see some of the innovation that’s happening out there, and the way people are using it.

Kenton Williston: So, a big question that’s on my mind, if we’re going to be talking about recent trends, is, obviously the world’s been turned upside down over the past 18 months or so, with the pandemic. And has that changed what people are considering with their IoT applications?

Pete Bernard: I think so. I mean, for sure. I mean, there’ve been many heroes of the pandemic, but the internet is one of the heroes of the pandemic, right? It’s kept us all connected and working throughout this whole thing.

But I think people are thinking a lot more about—other than just general acceleration—like, everything’s sort of accelerated. All of the experiments we had—that we were cooking, that we were going to do two or three years from now—have all been deployed, but you’re seeing a lot more AI-vision work. Maybe some things like, just curbside pickup, and, kind of like, online ordering, and all kinds of systems like that.

Automation has really accelerated a lot, I think, through the pandemic. And then we’re seeing a lot more things around healthcare and optimizing that, and using a lot more information at the edge to make sure people can have a smooth experience, and an authenticated experience. The pandemic has just sort of hyper-accelerated a lot of the stuff that’s been in the pipeline.

Kenton Williston: Boy oh boy—supply chain management has become so much more challenging, and having some technology help with that, I think, is going to be a really good thing in the long term.

Pete Bernard: I think actually one of the unsung heroes of the pandemic has been the QR code. Who would have thought QR codes would make such a comeback? When you go to a restaurant, everyone knows what a QR code is. They know how to look one up on their phone, and that’s become, like, an interesting little by-product.

Kenton Williston: So, I want to come back—you were talking about some of the large suite of offerings that Microsoft has, and kind of how that relates to this question of, okay—so we’ve already deployed, in pretty much any industry you look at, a lot of technologies that people have been talking about for some time and that just got hyper-accelerated.

Pete Bernard: Right.

Kenton Williston: And we have to figure out where to go from here, and that could be just a question of, how do we integrate these things back into our larger enterprise systems? Because this is all done in kind of an ad hoc way, and how do we secure them? How do we build on this going forward? And I think an important part of that conversation, like you said, is making these things a lot more developer-friendly and providing a platform where you can readily make new decisions and bring systems together in new ways. So, I just want to hear a little bit from you—what does that mean in practical terms?

Pete Bernard: Right. Yeah, no—I think you bring up a good point, because a lot of these systems that are getting deployed are not—they can’t be—sort of one-off, bespoke systems, right? You can’t, sort of, hard code your solution this year, and then hard code another solution next year. So you have to sort of think about, what are the problems you’re going to tackle today, versus two or three years from now, by adding more and more capabilities. So, we’re seeing people do—one example in retail is point-of-sale terminals, right? They’re sort of pretty prevalent right now, as point-of-sale terminals have been for many years. People are saying, “Well, I have—now I have these point-of-sale platforms. What else can I do with those platforms? Can I add AI vision to those point-of-sale terminals to provide some kind of security monitoring in my retail store?” Or, “Can I then add some other sensors, and other things?”

And so we’re seeing platforms like point-of-sale terminals, humble platforms like that, actually becoming kind of edge endpoints in and of themselves, that can do quite a bit of compute. And, actually, you can start adding on more and more capabilities to the platforms like that that you’ve already deployed. Or we’re seeing—another thing that we’re seeing is, quite often we’ll go into a retail store or something like that—I won’t name who it is—but there was a supermarket we were talking to recently—not in the US—and the first thing they said was, “We already bought the cameras.” So it’s like, “Last year we bought all these cameras, and they’re sitting in boxes.” Okay. Well I say, “It’s all been depreciated or appreciated—whatever.” So a lot of people have legacy equipment.

So, how do you connect legacy equipment, and make that more AI capable, and more secure, and more manageable, right? So, we’re seeing a real explosion in the use of gateways and, kind of, medium edge equipment that can connect a lot of brownfield—we call it brownfield, or legacy—equipment into the cloud securely. It can be managed, and then you can run AI workloads against this kind of legacy equipment. And so that’s another big thing that I think people are thinking about as well. So, you sort of have to think about it systemically, right? It’s like you have some problems you need to solve, but what equipment do you currently have that you can leverage to solve that? And then, how do you want to sort of scale that as inevitably, right—we’re talking about Intel—inevitably the chips get better and faster and cheaper, and all that good stuff, right? So next year, or the year after, this is going to be even better equipment to connect into the system.
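
As a rough illustration of putting that already-purchased, brownfield camera gear to work, a gateway can usually pull video from existing IP cameras over a standard protocol such as RTSP and hand frames to whatever AI model the solution uses. The sketch below uses OpenCV; the camera address and the run_inference helper are hypothetical placeholders.

```python
# Illustrative sketch: read frames from an existing (brownfield) IP camera on a
# gateway and pass them to an inference function. The RTSP address and
# run_inference() are placeholders, not part of any specific product.
import cv2

def run_inference(frame):
    # Placeholder for an AI-vision model running on the gateway's silicon.
    return []

cap = cv2.VideoCapture("rtsp://192.0.2.10:554/stream1")  # example address only
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    detections = run_inference(frame)
    print(f"{len(detections)} detections in this frame")
    # Results could then be forwarded to the cloud, e.g. via Azure IoT Hub.
cap.release()
```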

Kenton Williston: Yeah. And so, specifically, what’s Microsoft doing, you know with this whole—you’re going to have to help me with that acronym—the Azure A, D, IOP—all those, all those letters...

Pete Bernard: Right.

Kenton Williston: Yeah.

Pete Bernard: One of the things we’re doing is we have—obviously Azure is an incredible business, cloud business. And let’s just—I think it’s 95% of the Fortune 500 run their businesses on Azure. Some incredible statistic like that. And what we’re doing, from our group, is really helping out on the edge. So, what are all the interesting things connecting to Azure that are kind of getting superpowers from Azure. That’s what we’re really focused on, and making sure whether that’s working with Intel, for example, in Windows IoT. And we have something called EFLOW now, in which you can run Linux workloads on Windows.

So you can have a secure and managed Windows platform that everyone knows how to manage and deploy. And, on top of that, you can now run legacy Linux workloads that can be run from the cloud. Those could be AI workloads, or any kind of legacy Linux workloads, as opposed to having a separate box that runs Linux. So, that’s kind of one example of something that we’re doing out of our team that’s kind of really helping customers solve problems with the equipment they have, with a little bit of new software, which is pretty cool.
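
For a sense of what "legacy Linux workloads that can be run from the cloud" looks like mechanically, Azure IoT Edge (the runtime that EFLOW brings to Windows) describes the containers a device should run in a JSON deployment manifest pushed through Azure IoT Hub. The sketch below builds a heavily trimmed manifest in Python; the custom module name, container registry, and version tags are assumptions for illustration, and a real manifest carries additional settings such as routes and credentials.

```python
# Trimmed, illustrative Azure IoT Edge deployment manifest built as a Python
# dict. The "visionModule" entry, registry, and image tags are placeholders;
# a real deployment is applied through Azure IoT Hub, not written by hand.
import json

manifest = {
    "modulesContent": {
        "$edgeAgent": {
            "properties.desired": {
                "schemaVersion": "1.1",
                "runtime": {"type": "docker", "settings": {}},
                "systemModules": {
                    "edgeAgent": {
                        "type": "docker",
                        "settings": {"image": "mcr.microsoft.com/azureiotedge-agent:1.4"},
                    },
                    "edgeHub": {
                        "type": "docker",
                        "status": "running",
                        "restartPolicy": "always",
                        "settings": {"image": "mcr.microsoft.com/azureiotedge-hub:1.4"},
                    },
                },
                "modules": {
                    "visionModule": {
                        "type": "docker",
                        "status": "running",
                        "restartPolicy": "always",
                        "settings": {"image": "myregistry.azurecr.io/vision-module:1.0"},
                    }
                },
            }
        }
    }
}

print(json.dumps(manifest, indent=2))
```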

Kenton Williston: Yeah. So I think what you’re really pointing to here is a—one of the key pain points, or friction points, you might say, of interoperability, right? So you’ve got, like you said, all of this brownfield/legacy equipment, which doesn’t necessarily mean it’s horrendously old—even something that you just bought last year—like you said, “What do I do with this box of cameras now that I have them?” Right? And, in fact, you may have stuff that’s very old. Maybe you’ve got a point-of-sale device that’s three or five years old; maybe you’ve got an industrial automation system that’s 10 or 20 years old, right? You’ve got just a really broad array of things that were built at different points in time, with different ideas about whether they would be connected or not, and what kind of electronic infrastructure they would be interfacing to, if they were connecting. So, I imagine this is just one of the key things that people have trouble with. Would you agree with that?

Pete Bernard: Yeah, no. I think so. Because, like you said, I mean, everyone’s got sort of their own unique circumstances, right? They have a problem to solve. They have kind of an ROI envelope they need to get that problem to fit into—it’s worth only so much money for them to fix it. They probably have some equipment already that they don’t want to chuck. And hopefully we and our partners are able to work with these companies and find creative solutions that help solve those problems. I mean, getting pretty practical about it, frankly. And what’s exciting, I think, is that there’s no shortage of ways to solve problems these days, right? So even the topology of solutions—how much workload am I running at the far edge, versus a gateway, versus a co-located heavy edge server, versus the cloud—there’s a lot of variability there.

So that’s good news, in that there’s lots of different ways to solve problems in a very cost-efficient, very low-CapEx way. But I guess the downside of that—I guess that’s why we get paid—is that it’s complicated, and, to your earlier point, it can’t be so complicated that nobody can actually deploy the darn thing. So you have to take friction out—with developer tools, with platforms. If you look at all the SDKs we have up on GitHub, and the developer outreach, engagement, and enablement—we do that with Intel and all of our partners—so that people have, like, blueprints and reference platforms and reference designs, so that they can get, sort of, 80% there right out of the box. That’s really what we’re trying to do: take the friction out and give people the power of all of the optionality that’s out there without making it too complicated.

Kenton Williston: Yeah. And, I think something that comes to mind—hearing you lay out this landscape—is how much things have turned from an “I’ve got a box that does something” sort of mindset to a “How do I create some software to solve this problem?” mindset. Right? And then you get into questions, of course—like, where does this software live, and all these sorts of things. But, fundamentally, I think that, pretty much across the board, Internet of Things applications have really evolved to a place where the questions are more about the software model than the particulars of the hardware.

Pete Bernard: Yeah. And I think so, because the software has to really work across many different pieces of a system, right? Different pieces of hardware have to all work together. And that’s where the software really comes into play. You have your business logic and your power apps, and all that stuff running on top. And then the hardware is there to either collect the data, right? If it’s kind of a sensor or AI-vision thing. And, in some cases, obviously provide some kind of UI and input, if it’s a point-of-sale terminal or a McDonald’s kiosk, menu-ordering system, et cetera. So it really is a very software-driven exercise. I mean, we’ve seen, if you look at—I forget the stats—something like 7% of all Tesla employees are software engineers. And I think in General Motors it was somewhere like 1% or 2% are software engineers.

It just shows that, like, newer companies, or companies that are disrupting spaces, have a lot of software capability. I actually do a lot of mentorships for college students, and they always ask me, “What companies should I work for if I want to be in tech?” And I always tell them, “Well, pretty much any company is in tech these days.” Right? Because. . .

Kenton Williston: Absolutely.

Pete Bernard: Everyone, whether you’re at McDonald’s, or Chevron, or whoever, they have—it’s tech, and you have to be tech capable, and you have to have software capability built into the company. So, from a career perspective, that’s exciting for a lot of people coming out of college and getting trained and stuff, because they can pretty much work anywhere as long as you have software capability. But to your earlier point, yes, the software is really critical, and we’re taking advantage of a lot of new semiconductor capability on lower power, higher TOPS, higher performance, lower cost. And that’s going to continue, but really it’s the software. And maybe I’m a little biased coming from Microsoft, right? It’s really the software that can unlock some of these amazing scenarios.

Kenton Williston: Yeah. And I will co-sign under that bias, because I think, again, what I’m seeing is just that the capabilities of the hardware have gotten to be so incredible. What you can do with a relatively inexpensive, relatively low-power piece of hardware is so amazing. And I think you can kind of see this even in the consumer-IT worlds, right? So, there was just an iPhone 13 dropped, right?

Pete Bernard: Right.

Kenton Williston: It’s a nice phone, right? But it’s like, well, how much better is it than the iPhone 12? Well, I mean, it’s pretty similar, right? Because just the hardware is so powerful that you just don’t really notice a big difference with a hardware upgrade from your day-to-day use. And same kind of thing with laptops and lots and lots of other places that are, you know, just kind of every day, individual experiences you can kind of see for yourself that it’s like, well, yeah—the hardware is getting better. But what really matters is, okay—the iPhone 13 is out, but really what’s cool is what’s in iOS 15—is what the software does. It’s really unlocking something new and exciting.

Pete Bernard: I’d say that’s true. Although we’re also seeing, with new fabrication processes and other things—especially in the AI space (I spend a lot of time there)—that the AI models require a lot of horsepower. And in some cases, like, you go back to the solutions that customers—the problems they want to solve. And they require AI models that are really pretty heavy edge, pretty processor intensive still. And so we’re starting to see that AI-acceleration capability in silicon get into the lower-cost, kind of more edgy platforms, but it’s still pretty heavy-edge oriented. And then—so it’ll be interesting to see over the next five years how that sort of pushes out to the farther edge and gets even lower power, right? And less costly. But, so there’s still a lot of headroom, and I think an opportunity for the hardware to accelerate—maybe not in the consumer space as much, but I think on the commercial side, everyone’s looking for higher performance, lower power consumption, lower cost.

Kenton Williston: Yeah, absolutely. And I think that the point you mentioned about AI is particularly a prime example of that—where a lot of accelerators are being added into, for example, the latest generation of the Intel® Core™ lineup—that makes a big difference in what you can do. And, again, at relatively low power and low cost. And I think what’s really interesting to me is, I mean, we can kind of all see where this is going, and I feel like a big part of how to have a framework that’s forward looking—like we’ve been talking about: don’t just make some bespoke thing, but make a solution that’s going to work for what you’re trying to do tomorrow as well—a lot of that has to do, in my opinion, with having an underlying platform that’s very flexible and scalable. So that you can readily say, “Okay, maybe today we’re doing this workload in the cloud. Maybe tomorrow it’s in a gateway. Maybe next year it’s on the edge.” Right? And you don’t have to feel too pinned down to anything.

Pete Bernard: Yeah. And so the management and deployment of workloads is a big deal for us at Microsoft and what we’re doing with Azure and Azure Arc and Kubernetes, and a bunch of these technologies, where people can develop and deploy applications, quote unquote “software,” and be able to manage where that software is running and where those workloads are running. And I think we’ll see even more flexibility over the years, the next few years, in that area. And that’s pretty exciting because, like you said, things are going to evolve pretty quick.
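
One way to picture that flexibility: where both the cloud and the edge expose a Kubernetes API, as Arc-enabled clusters do, the same deployment object can target either environment simply by changing which cluster the client points at. The sketch below uses the standard Kubernetes Python client; the image, names, and namespace are placeholders rather than anything Azure-specific.

```python
# Illustrative only: create the same Deployment against whichever cluster the
# active kubeconfig context points to (a cloud cluster or an edge cluster).
from kubernetes import client, config

config.load_kube_config()   # picks up the currently selected cluster/context
apps = client.AppsV1Api()

container = client.V1Container(
    name="inference",
    image="myregistry.azurecr.io/inference:1.0",   # placeholder image
)
deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="edge-inference"),
    spec=client.V1DeploymentSpec(
        replicas=1,
        selector=client.V1LabelSelector(match_labels={"app": "edge-inference"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "edge-inference"}),
            spec=client.V1PodSpec(containers=[container]),
        ),
    ),
)
apps.create_namespaced_deployment(namespace="default", body=deployment)
```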

Kenton Williston: Absolutely. So, I’m curious—one of the big challenges beyond just, like, “What is the right framework to give us a forward-looking path that we’ll have this kind of flexibility?” Is not just the flexibility, but the scale, right? Oftentimes it’s relatively easy to put together a sort of a lab-based proof of concept that shows, yes: you can actually execute some AI workload to recognize whatever you’re trying to recognize. But then you get into questions of scale, and it’s not only: how do you deploy this heavy workload in a sensible way? But all the other things that come into it. Like the security, like the device management—all these other things that are really important beyond just running whatever application it is you’re trying to do. So, what do you see as some of the key pain points there, and how is Microsoft trying to reduce the friction of those?

Pete Bernard: Yeah, that’s a good question. So, yeah, you’re right. I mean, there’s the “show your boss that your camera could recognize a banana,” and then there’s actually deploying that. And one of the things we’re trying to do is kind of minimize the steps between the demo to your boss and deployment, for lack of a better term. So, like, Azure Percept is something we introduced in March, and it uses some pretty cool Intel technology as well. And it’s really a developer kit that enables people to, yeah, quickly and easily recognize bananas and import their own AI models, or use a bunch of models out of the box. But then, because you’re now deploying that on Azure, you’ve got full Azure IoT—Azure IoT Hub device management and deployment, Azure Device Update capabilities—so that you’ve got everything you need, actually, to really go to production with that whole software framework that you’ve developed for your proof of concept.

So there isn’t a—you don’t have to sort of start over again when you move to production hardware, which is pretty cool. So we’re trying to give developers a way to really harvest the work that they’ve done in the POC stage, and really not have to do anything over again to get to a full deployment stage. And, like you said, the production hardware may change, and you’re going to maybe change models and get something weatherized or whatever, but the software and AI models that you’ve developed and trained, and the way you’ve managed and deployed them—that’s all sort of production-level code. And that’s, I guess, one of the benefits of working with a platform like Azure and working with partners like Intel: this is what we call GA—general availability—type stuff. And being able to develop and deploy on Azure kind of gets you pretty far along when you want to actually do full production deployment.
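
As a small illustration of the device-management plumbing being described, the Azure IoT device SDK for Python lets the same code that ran in a proof of concept connect to IoT Hub, read configuration from the device twin, and send telemetry in production. The connection string, property name, and payload below are placeholders.

```python
# Minimal sketch with the azure-iot-device SDK: connect to IoT Hub, read a
# desired setting from the device twin, and send one telemetry message.
# The connection string, "detectionThreshold", and the payload are placeholders.
from azure.iot.device import IoTHubDeviceClient, Message

conn_str = "HostName=<hub>;DeviceId=<device>;SharedAccessKey=<key>"  # placeholder
client = IoTHubDeviceClient.create_from_connection_string(conn_str)
client.connect()

twin = client.get_twin()
threshold = twin["desired"].get("detectionThreshold", 0.5)  # hypothetical setting

client.send_message(Message('{"bananasDetected": 1}'))      # example payload only
client.patch_twin_reported_properties({"detectionThreshold": threshold})
client.disconnect()
```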

Kenton Williston: Yeah, absolutely. So, this is also bringing up a question in my mind of what kind of skill sets organizations need. So, we were talking earlier just about the fact that it doesn’t matter what kind of business you’re in—you’re in the tech business, period, because you can’t not be. I think the good news is, everyone’s got at least some sense of what that means. You know any kind of organization is going to have an IT department, and things like this, but surely there are things that are unique when you are starting to make these very complex decisions about having a highly scalable, forward-looking-in-all-these-sorts-of-good-things sort of an architecture for your IoT deployments. That gets, like we’ve been talking about, to be very complicated. So, are there any particular kinds of skill sets or resources that companies should be making sure they have shored up?

Pete Bernard: Yeah. I think one of the things we’ve seen is sort of a confluence of embedded developers, and data science–kind of developers, and AI developers. And one of the things we’re trying to do with our tools and Visual Studio and the whole Azure platform is like—how do we enable embedded developers to become smarter AI developers, and vice versa? And so you don’t necessarily have two different types of software developers—like, one software developer can skill up and become really good at all of those things, right? To develop and train AI models, and to also write code to develop and deploy applications on embedded or edge devices. And there’s a ton of Microsoft learning—learning paths and things like that. But certainly the data science and AI capabilities are a new one that, I think, is really required for lots of companies these days.

And then, the ability to understand these platforms—common platforms and, obviously, Azure, but there’s AWS and GCP, but also things like Kubernetes and workload management. So I think all those things are kind of feeding into each other. And I think software developers, I’ve seen them be—coming myself from Phoenix, where we are firmware developers, BIOS developers, kind of a unique species—the developers today are a lot more well-rounded, I would say, and usually have a pretty well-rounded skill set. And I think AI is just the latest skill set for people to add into the résumé.

Kenton Williston: So it sounds to me like maybe part of this is reconceiving what even your tech department is, and that it’s not just IT—it needs to be bigger than that. And the other thing that’s coming to mind here is, maybe this is a really golden opportunity to think about the relationships you have with the Microsofts of the world, with the Intels of the world, and with whoever your vendors and systems integrators, and so forth, are.

Pete Bernard: Right. Yeah, I mean, there’s some incredible solutions providers out there that have done a great job taking a lot of this tech. And, yeah, you’re right—you don’t have to be a firmware developer when you work at a shoe store; you can buy a package solution for AI vision and security from a solutions provider and be on your way, right? So it does make a bit of a difference, but, yeah, I think that there’s definitely some interesting opportunities there. The solutions provider space is pretty fantastic. And I think that, you know, in the old days—like when I was working out in Silicon Valley—the IT department would be behind a Dutch door—one of those halfway doors—you’d walk through it and say your laptop didn’t work or something, and they’d take it from you and tell you that you could come back in a few hours, or something.

So, that doesn’t really happen anymore. Your IT department is about security and about productivity, and probably doing some custom application development and, hopefully, buying some of these solutions or sourcing some of these solutions and adding your own, sort of, power apps and other business logic on top for your particular business. So, yeah—I think it’s an incredible opportunity for developers these days to get out of their comfort zone a little bit, and to start experimenting with things like AI and building on top of some of these solutions that are out there that are really part of a kind of a basket of solutions.

Kenton Williston: And what are Microsoft and Intel doing together to support this ecosystem of IT departments and solutions providers?

Pete Bernard: Well, there’s a ton of work, obviously, we’re doing on the platforms for manageability and security. So, one of the interesting things we’re doing with Intel with Azure Percept—we’re actually encrypting the AI models on Azure and then decrypting them on the device.

Kenton Williston: Oh, cool.

Pete Bernard: And, so, that’s kind of an interesting new thing, right? Because no one else has really done that before, and the reason is because the AI models themselves are really important IP. And so that is an attack vector, right? That someone could take some important AI model IP that you’ve spent millions of dollars developing and training. And so we’re doing encryption end to end, making that invisible to the developer, and working with Intel on that.
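
Microsoft does not spell out the mechanism here, but the general pattern of encrypting a model artifact in the cloud and decrypting it only on the device can be sketched with ordinary symmetric encryption. The example below is purely illustrative and is not the Azure Percept implementation; in a real system the key never sits next to the model and would be protected by the device's hardware root of trust.

```python
# Purely illustrative sketch of the "encrypt in the cloud, decrypt on the
# device" pattern, using the cryptography package. The model file names are
# placeholders, and real key management would not keep the key in memory or
# on disk like this.
from cryptography.fernet import Fernet

key = Fernet.generate_key()
fernet = Fernet(key)

# Cloud side: encrypt the trained model artifact before it is distributed.
with open("model.onnx", "rb") as f:           # placeholder model file
    ciphertext = fernet.encrypt(f.read())
with open("model.onnx.enc", "wb") as f:
    f.write(ciphertext)

# Device side: decrypt into memory just before loading it into the runtime.
with open("model.onnx.enc", "rb") as f:
    model_bytes = fernet.decrypt(f.read())
```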

Kenton Williston: You’ve brought up a lot of really cool, cutting-edge stuff that’s really brand new—around Percept and around EFLOW—where there’s just a lot of stuff going on. And, really, I mean, honestly, even the whole Azure IoT platform is still kind of newish in the grand scheme of things, right? There’s just a ton of new capabilities coming out. What do you see in, sort of like, the medium to long term of where your priorities lie? And where you want the IoT ecosystem to be?

Pete Bernard: So, there’s a lot of, like I mentioned—the brownfield area is a big area, right? Every company has a problem, and every company has equipment. And so sort of one of the things that we’re seeing a lot of action on is: How do I leverage my legacy equipment in the brownfield as opposed to greenfield? Greenfield means buying new equipment, and brownfield means I have some—I have my box of cameras. So we’re seeing a lot of activity there, and thinking about the existing platforms, and how can I write new software and applications to work on those platforms? So, for example, EFLOW—back to that—can actually be deployed on Windows IoT today. You don’t need to buy a new box with Windows IoT on it, or a new instantiation of that. You can take existing Windows IoT and actually go download EFLOW as a Linux container on it today, which is pretty cool, right?

So now you’ve sort of married legacy workloads with existing equipment that’s in the field. So there’s a lot of brownfield work that’s going on right now. And people are just trying to do smarter software with a lot of their existing equipment, and thinking about, “How do I solve my problem with some of the stuff I already have around the store or in my manufacturing plant, et cetera?” And, at the same time, people are planning for the next big hardware cycle, right? And how do I use 5G and private 5G, and how do I use Wi-Fi 6, and how do I do all kinds of new things with these new vision processors, right? So that’s all happening in parallel, but I think brownfield is kind of where there’s a lot of near-term action.

Kenton Williston: Yeah. That makes sense. And absolutely we haven’t touched too much on—nor do we have a lot of time left to dig into—but I think beyond the silicon, beyond the software, the connectivity side of things has really changed a lot. And I do think 5G, Wi-Fi 6, are going to continue to unlock a lot of really exciting new possibilities—be able to do things at scale they just couldn’t do before.

Pete Bernard: Yeah. And I would say, I think my advice for folks is to kind of keep an open mind when it comes to connectivity, right? So it’s, yes, there’s Wi-Fi, but also 5G. There’s something called LPWA, or low-power wide-area, and that can include NB-IoT. I mean, there’s an alphabet soup here, but, like, LPWA. There’s Bluetooth Low Energy, which has gotten really good. So there’s lots of different ways to connect these things together, and people should really keep an open mind about what’s the best way to do that, because there’s so many options these days.

Kenton Williston: Yeah, for sure. And I think that’s part of what makes this time so exciting, is that there’s just—there are so many options and so many possibilities just coming online all the time. Well, listen, Pete, it’s really been great talking to you today. Any key takeaways you’d like to leave with our audience? Kind of the big picture of how do you make the life of an IoT developer easier?

Pete Bernard: Yeah. We really need to be—they say “customer obsessed.” It sounds a little trite, but it really does mean something. And being customer obsessed means thinking about the solutions, not just the technology. So, think about how you can help solve problems for your company or your customers holistically, and assume that there’s a heterogeneous ecosystem out there. And part of your value add is being able to glue that stuff together in a seamless way to solve some of those problems. So, really thinking at that altitude is pretty helpful.

Kenton Williston: Excellent. Well, with that, I’d just like to thank you again for joining us. Really interesting conversation.

Pete Bernard: Sure. Appreciate it. Thanks for having me.

Kenton Williston: And thanks to our listeners for joining us. To keep up with the latest from Microsoft, follow them on Twitter @Microsoft, and LinkedIn @Microsoft. If you enjoyed listening, please support us by subscribing and rating us on your favorite podcast app. This has been the IoT Chat. We’ll be back next time with more ideas from industry leaders at the forefront of IoT design.

The preceding transcript is provided to ensure accessibility and is intended to accurately capture an informal conversation. The transcript may contain improper uses of trademarked terms and as such should not be used for any other purposes. For more information, please see the Intel® trademark information.

About the Author

Kenton Williston is an Editorial Consultant to insight.tech and previously served as the Editor-in-Chief of the publication as well as the editor of its predecessor publication, the Embedded Innovator magazine. Kenton received his B.S. in Electrical Engineering in 2000 and has been writing about embedded computing and IoT ever since.
