
AI • IOT • NETWORK EDGE

Remove Friction from IoT Development with Microsoft


Companies are putting IoT application projects on the fast track, and the pandemic has only accelerated that trend. But does every organization have the skill sets and resources they need to scale up these projects smoothly? Where does AI fit into the picture? And how do you balance old hardware realities with new software innovations?

We talk with Pete Bernard, Senior Director of Silicon and Telecom in the Azure Edge Devices Platform and Services Group at Microsoft, about the major trends and challenges in IoT development, how to make the life of an IoT developer easier, and why that matters.

What are some of the major trends or challenges in commercial IoT applications these days?

It’s a very solutions-oriented market out there, especially in what I would consider the edge ecosystem. If you talk to commercial customers, they have some pretty complicated problems, and those problems are not solved with a single device or a single app. Quite often it’s about how these systems work together. How do they work together securely? How can they be managed? How can they be deployed?

It’s a heterogeneous space. So, for example, there are AI models that might be trained in the cloud on Azure, and then get deployed to the edge. But how are they optimized for the silicon on the edge? We’ve been working a lot with Intel® on OpenVINO and a platform we call Azure Percept, which we launched in March. It’s just one example of where you really need to be able to take advantage of the characteristics and capabilities of the silicon to get the performance and the power to solve problems.
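
To picture that workflow, here is a minimal Python sketch of the “train in the cloud, optimize for the edge silicon” step using the OpenVINO runtime. The model file and device name are placeholder assumptions, not details from the interview.

```python
# Minimal sketch: load a cloud-trained model and compile it for the
# silicon on a specific edge device with OpenVINO.
# "model.onnx" and the device name "CPU" are placeholders.
import numpy as np
from openvino.runtime import Core

core = Core()

# Read the exported model (ONNX or OpenVINO IR) produced by cloud training.
model = core.read_model("model.onnx")

# Compile the same model for whatever accelerator the edge box exposes,
# e.g. "CPU", "GPU", or another entry in core.available_devices.
compiled = core.compile_model(model, device_name="CPU")

# Run one inference on a dummy input to confirm the compiled model works.
input_tensor = np.random.rand(1, 3, 224, 224).astype(np.float32)
result = compiled([input_tensor])[compiled.output(0)]
print(result.shape)
```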

Can you talk about the rise of these really intensive workloads and why they need to be done at the edge?

We’ve seen the evolution from standalone, or disconnected, systems to simple sensors connected to the cloud that just send data up. And now we have the cloud talking to the edge, and the edge talking to the cloud. Basically, you can run a lot of compute on the edge, asynchronously from the cloud, and you have to really figure out how those things can work together.
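
As a rough illustration of edge compute reporting back to the cloud, the sketch below uses the Azure IoT Hub device SDK for Python to send locally computed telemetry. The connection string and payload fields are hypothetical placeholders.

```python
# Sketch: an edge device does its own compute locally, then reports a
# summary to Azure IoT Hub. Requires the azure-iot-device package.
# CONNECTION_STRING and the payload fields are placeholders.
import json
import time
from azure.iot.device import IoTHubDeviceClient, Message

CONNECTION_STRING = "HostName=<your-hub>.azure-devices.net;DeviceId=<device>;SharedAccessKey=<key>"

client = IoTHubDeviceClient.create_from_connection_string(CONNECTION_STRING)
client.connect()

try:
    for _ in range(3):
        # Pretend this summary came from a local model or sensor-fusion step.
        payload = {"device": "edge-camera-01", "people_count": 4, "ts": time.time()}
        msg = Message(json.dumps(payload))
        msg.content_type = "application/json"
        msg.content_encoding = "utf-8"
        client.send_message(msg)  # device-to-cloud telemetry
        time.sleep(5)
finally:
    client.shutdown()
```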

“What we’re trying to do is take the #IoT friction out, and give people the power of all the optionality that’s out there without making it too complicated.” –@digitaldad @microsoft via @insightdottech

There are a lot of new scenarios out there where you need some pretty high-performance edge computing, and maybe that’s for privacy reasons, maybe that’s for bandwidth reasons. But it’s exciting to see some of the innovation that’s happening out there, and the way people are using it.

Has the pandemic changed how companies are considering their IoT applications?

I think so. I think the internet is one of the heroes of the pandemic, in that it’s kept us all connected and working throughout. But everything has accelerated. All the experiments that were cooking, the things we were going to do two or three years from now, have all been deployed.

And you’re seeing a lot more AI-vision work. Automation has really accelerated a lot through the pandemic. And there’s a lot more optimizing of that around healthcare—using a lot more information at the edge to make sure people can have a smooth, authenticated experience. The pandemic has just hyper-accelerated a lot that’s been in the pipeline.

How do we build on this acceleration going forward, particularly in terms of bringing systems together in new ways?

The systems that are getting deployed can’t be one-off, bespoke systems, right? You can’t hard-code your solution this year, and then hard-code another solution next year. You have to think about what problems you’re going to tackle today, versus two or three years from now, by adding more and more capabilities.

One example is point-of-sale terminals in retail—they’re pretty prevalent right now. People are saying, “I have these point-of-sale platforms—what else can I do with them? Can I add AI vision to provide some kind of security monitoring in my store?” So, we’re seeing humble platforms like that actually becoming edge endpoints in and of themselves that can do quite a bit of compute.

Another thing that we’re seeing is that a lot of people have legacy equipment. There was a supermarket we were talking to recently, and the first thing they said was, “Last year we bought all these cameras, and they’re sitting in boxes.” So, how do you connect legacy equipment and make that more AI capable, more secure, more manageable?

You must think about it systemically. If you have some problem you need to solve, what equipment do you currently have that you can leverage to do it? And then, how do you scale it when, inevitably, the chips get better and faster and cheaper? So, next year, or the year after, there is going to be even better equipment to connect into the system.

What is Microsoft doing to help with all that?

Obviously, Azure is an incredible cloud business. I think 95% of the Fortune 500 run their businesses on Azure. And what we’re doing from our group is really helping on the edge.

We have something called EFLOW, or Azure IoT Edge for Linux on Windows, which lets you run Linux workloads on Windows. So you can have a secure, managed Windows platform that everyone knows how to manage and deploy, and on top of that you can now run legacy Linux workloads deployed from the cloud, as opposed to having a separate box that runs Linux. That’s one example of something we’re doing out of our team that’s really helping customers solve problems with the equipment they have, plus a little bit of new software, which is pretty cool.

The good news is that there are lots of different ways to solve problems in a very cost-efficient, very low-CapEx way. But the downside of that (and I guess that’s why we get paid) is that it’s complicated. You have to take the friction out, with developer tools and with platforms. That’s really what we’re trying to do: take the friction out, and give people the power of all the optionality that’s out there without making it too complicated.

Would you say that IoT applications have really evolved to a place where it’s more about the software model than the particulars of the hardware?

I think so, because the software has to really work across many different pieces of a system. Different pieces of hardware all have to work together, and that’s where the software really comes into play. If you have your business logic and your Power Apps and all that running on top, it’s a really software-driven exercise. Something like 7% of all Tesla employees are software engineers, and I think at General Motors it’s something like 1% or 2%.

I do a lot of mentoring of college students, and they always ask me, “What companies should I work for if I want to be in tech?” And I always tell them, “Well, pretty much any company is in tech these days.” Everyone has to be tech capable, and you have to have software capability built into any company. So from a career perspective that’s exciting for a lot of people coming out of college, because they can pretty much work anywhere if they have software skills.

We are taking advantage of a lot of new semiconductor capability: lower power, higher TOPS, higher performance, lower cost. So there’s still a lot of headroom there and, I think, an opportunity for the hardware to accelerate. Maybe that’s not in the consumer space as much, but on the commercial side everyone’s looking for higher performance, lower power consumption, and lower cost. And that’s going to continue. But really it’s the software that can unlock some of these amazing scenarios.

What do you see as some of the key pain points around scaling up? And how is Microsoft trying to reduce the friction there?

That’s a good question. There’s the “show your boss that your camera could recognize a banana,” and then there’s actually deploying that. And one of the things we’re trying to do is minimize the steps between the demo to your boss and that deployment. Azure Percept uses some pretty cool Intel technology; it’s really a developer kit that enables people to quickly and easily recognize bananas and import their own AI models or use a bunch of models out of the box.
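
To make that “recognize a banana” proof-of-concept stage concrete, here is a generic Python sketch using ONNX Runtime with an off-the-shelf image classifier; it stands in for the kind of thing a developer kit like Azure Percept streamlines. The model file, label list, and image path are assumptions for illustration only.

```python
# Sketch of a PoC-stage check: run one image through an off-the-shelf
# ONNX image-classification model and print the top label.
# "classifier.onnx", "labels.txt", and "banana.jpg" are placeholders.
import numpy as np
import onnxruntime as ort
from PIL import Image

session = ort.InferenceSession("classifier.onnx")
input_name = session.get_inputs()[0].name

# Basic preprocessing for a typical 224x224 RGB classifier.
img = Image.open("banana.jpg").convert("RGB").resize((224, 224))
x = np.asarray(img, dtype=np.float32).transpose(2, 0, 1)[np.newaxis] / 255.0

scores = session.run(None, {input_name: x})[0]
labels = [line.strip() for line in open("labels.txt")]
print("Top prediction:", labels[int(np.argmax(scores))])
```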

We’re trying to give developers a way to really harvest the work that they’ve done in the POC stage, and not have to do anything over again to get to a full deployment stage. The production hardware may change—you’re going to maybe change models and get something weatherized, or whatever. But the software and AI models that you’ve developed and trained, and the way you’ve managed and deployed them—that’s all sort of production-level code. And being able to develop and deploy on Azure gets you pretty far along when you want to actually do full production deployment.

IT deployments can be incredibly complicated these days. Are there any particular skill sets or resources that companies should be making sure they have shored up?

One of the things we’re trying to do with our tools and Visual Studio and the whole Azure platform is to figure out how to enable embedded developers to become smarter AI developers, and vice versa, so you don’t necessarily have to have two different types of software developer. One software developer can skill up and become really good at all of those things—developing and training AI models, and also writing code to develop and deploy applications on embedded or edge devices. But certainly, the data science and AI capabilities are the new skill sets that are really required for lots of companies these days.

In the old days the IT department would be behind one of those half doors, and you’d walk through it and say your laptop didn’t work, or something. And they’d take it from you and tell you that you could come back for it in a few hours. That doesn’t really happen anymore.

Your IT department is about security and about productivity, and it’s probably doing some custom application development and, hopefully, buying or sourcing some of these solutions and adding its own Power Apps and other business logic on top for your particular business. I think it’s an incredible opportunity for developers these days to get out of their comfort zone a little bit, and to start experimenting with things like AI.

Where do you see the IoT ecosystem going in the medium to long term?

Every company has a problem, and every company has equipment. And so one of the things that we’re seeing a lot of action on is, “How do I leverage my legacy equipment in the brownfield as opposed to the greenfield?” We’re seeing a lot of activity there: “How can I write new software and applications to work on those platforms?”

And, at the same time, people are planning for the next big hardware cycle: “How do I use 5G and private 5G? And how do I use Wi-Fi 6? And how do I do all kinds of new things with these new vision processors?” So that’s all happening in parallel, but I think brownfield is where there’s a lot of near-term action.

The connectivity side of things has really changed a lot. Can you talk a bit about that?

I think my advice for folks is to keep an open mind when it comes to connectivity. So, there’s Wi-Fi, but also 5G. There’s something called LPWA, or low-power wide-area networking. There’s Bluetooth Low Energy, which has gotten really good. There are lots of different ways to connect these things together, and people should really explore which one is best for their scenario, because there are so many options these days.

Any key takeaways for how to make the life of an IoT developer easier?

We really need to be “customer obsessed.” It sounds a little trite, but it really does mean something. Being customer obsessed means thinking about the solutions, not just the technology. So think about how you can help solve problems for your company or your customers holistically, and assume that there’s a heterogeneous ecosystem out there. Part of your value add is being able to glue that stuff together in a seamless way to solve problems.

Related Content

To learn more about overcoming IoT development challenges, listen to our podcast Take the Pain Out of IoT Projects with Microsoft.

About the Author

Kenton Williston is an Editorial Consultant to insight.tech and previously served as the Editor-in-Chief of the publication as well as the editor of its predecessor publication, the Embedded Innovator magazine. Kenton received his B.S. in Electrical Engineering in 2000 and has been writing about embedded computing and IoT ever since.
