

The Role of Digital Twins in Manufacturing Operations

Ricky Watts, Martin Garner

Today, companies are on a never-ending quest to cut costs and shrink production time. To achieve those goals, they increasingly turn to digital twins to pinpoint process failures before any product goes into production.

While available for years, this virtual-model technology increasingly helps organizations test performance and efficacy by transferring live data from a real-world system to its digital replica. The result: instant visual feedback that elevates risk assessment and decision-making.

In this podcast, we examine the central role that digital-twins technology plays in machine testing, simulation, monitoring, staff training, and maintenance. In addition, we detail how factories harvest mountains of data by using digital twins to pinpoint valuable information and extract maximum value.

Listen Here

Apple Podcasts      Spotify      Google Podcasts      Amazon Music

Our Guests: CCS Insight and Intel®

Our guests this episode are Martin Garner, COO and Head of IoT Research for CCS Insight; and Ricky Watts, Industrial Solutions Director at Intel®.

Martin’s main areas of interest are in mobile phone usage, internet players and services, connected homes, and IoT. He joined CCS Insight in 2009 and works on the commercial and industrial side of IoT.

Ricky joined Intel about three years ago and since then has focused on ensuring industrial edge solutions are safe, secure, and reliable. Prior to joining Intel, he was at Wind River for nine years and was responsible for the company’s industrial products and solutions.

Podcast Topics

Martin and Ricky answer our questions about:

  • (2:21) The definition of digital twins
  • (5:19) What digital twins mean to the manufacturing industry
  • (9:06) The challenges of new digital technologies
  • (12:51) How to successfully implement digital twins within the factory
  • (18:12) What skills manufacturers need to be successful
  • (21:47) Tools and technologies for digital-twin adoption
  • (25:16) The ecosystem of partners making digital twins possible

Related Content

To learn more about digital twins in manufacturing, read Bringing Digital Twins to the Factory Floor. For the latest innovations from CCS Insight and Intel, follow them on Twitter at @ccsinsight and @Inteliot, and on LinkedIn at CCS-Insight and Intel-Internet-of-Things.

Transcript

Christina Cardoza: Hello and welcome to the IoT Chat, where we explore the latest developments in the Internet of Things. I’m your host, Christina Cardoza, Associate Editorial Director of insight.tech, and today we’re talking about bringing digital twins into the factory with Martin Garner from CCS Insight and Ricky Watts from Intel®. But before we jump into the conversation, let’s get to know our guests a bit more. Martin, welcome to the show. What can you tell us about CCS Insight and your role there?

Martin Garner: Thank you, Christina. Well, so I work at CCS Insight. We’re a medium-sized and quite fast-growing analyst firm focusing on technology markets. I lead the analyst research. We do industrial IoT, and I did a big piece of research on digital twins a few months ago, which is obviously part of what we’re talking about. And that is already available to Intel’s partners through insight.tech. So, at CCS Insight, I’m also COO.

Christina Cardoza: And, Ricky, thanks for joining us today. Please tell us more about yourself and Intel.

Ricky Watts: Thanks for the introduction and, yeah, Ricky Watts. I am in Intel. I work in our Federal and Industrial business unit, FIS, as we call it here. I look after the industrial solutions. So, think of me as looking at Intel Technologies, and how do we apply them into the industrial landscape, industrial market segment. So, probably everybody knows Intel makes chips for platforms for the industrial landscape. So, I’m looking at how do we take those chips and work with them as we look at our partners bringing through solutions, including digital twins, which we’re going to talk about today as well. So, it’s really about holistically matching Intel technologies into those marketplaces.

Christina Cardoza: Great to have you both today. Like I mentioned in my intro, we’re talking about digital twins and manufacturing, and digital twin is not necessarily a new concept, but it is something that’s gaining a lot of interest recently. So, Martin, I’m going to start with you for this question. Since it’s not a new concept, but I do think that there is a little bit of confusion on what digital twins are and there may be many definitions out there, so can you explain to us what a digital twin is, especially in the context of a manufacturing factory?

Martin Garner: Oh, for sure. And I completely agree. There are lots of definitions, and I think maybe Ricky will have a slightly different variant on mine, but I like the view from the Digital Twin Consortium, the industry body around these things, which is that it—a digital twin is a virtual model of things and machines that exist in the real world, but also processes that are in use in factories and things.

But, as well as that, there needs to be some sort of synchronization between the real thing and the virtual model. They need to keep each other up to date. Now, that could be real time or it could be much slower than that. And of course you need to know how frequently, and what’s the quality of the data that’s being synced between them. Now, that all sounds quite simple, but actually there are lots of layers going on within this. So, there’s identity, there’s a data feed, there’s permissions, there’s models of the data and of the machine. There’s roles and policies and analytics and AI. So there’s quite a lot going on inside that.

Now, for manufacturing what this means is it’s very easy to think of a sort of James Bond style diagram on the boardroom wall, where you can see the live state of all the operations in one view. Actually you can fit that to the different roles that exist—so the CEO gets one view, but a maintenance engineer gets a completely different view because they need different bits of information. More than that, you can kind of go on and analyze processes and machines and some of the materials being processed. You can look at wear rates. You can do predictive maintenance, process modeling, and optimization. There’s really a lot you can do here.

One of the uses I really, really like here is that you can do software testing and simulation. So, you can do a software update on the virtual machine first, validate it, make sure it doesn’t crash or break, and then download it to the real thing. You can also do staff training on machines without letting them loose, again, on the real thing.

So, I think it’s really a kind of interesting set of uses, and we’re now starting to think about the grander, scaled-up vision from here on. Because many factories are—they’re the hub of a very large supply chain. So why not have a digital twin of the whole supply chain, the upstream bits and the downstream bits, and get a much more joined-up view of what’s going on? And that might include a machine that you’ve supplied to a customer so that you can see how that’s working. Tesla does that with its cars, and I think if you’re doing as-a-service—if you’re a supplier-as-a-service thing—you need digital twins so you can see what’s going on properly and support it well. So that’s how I look at digital twins and how they fit basically into manufacturing.
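To ground that definition, here is a minimal sketch of what the core of a twin might look like in code: an identity for the real asset, a mirrored state, and a synchronization step fed by live readings. All names and fields here are illustrative assumptions, not any specific product's model.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class DigitalTwin:
    """Bare-bones twin: an identity, a mirrored state, and a sync record."""
    asset_id: str                                  # identity of the physical machine
    state: dict = field(default_factory=dict)      # mirrored sensor/process values
    last_synced: Optional[datetime] = None         # when the twin last matched reality

    def sync(self, readings: dict) -> None:
        """Apply a batch of live readings from the real machine to the model."""
        self.state.update(readings)
        self.last_synced = datetime.now(timezone.utc)

# Usage: a (hypothetical) press twin updated from a live telemetry feed
press = DigitalTwin(asset_id="press-07")
press.sync({"spindle_temp_c": 71.4, "vibration_mm_s": 2.3})
print(press.state, press.last_synced)
```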

Christina Cardoza: Great. Excited to dig in a little bit more about those opportunities digital twins are bringing to the factory. Ricky, are you aligned with Martin’s definition of digital twins?

Ricky Watts: I think Martin gave a really good overview of what a digital twin is, and I completely agree with everything that he said. The one thing I would add, Christina, is to think about what has brought digital twins about—why digital twins exist, and where a digital twin comes into the picture—and really it relates to data. Manufacturers are creating enormous amounts of data from their machines. Now, that data is being taken off and analyzed to understand what's going on with the machine: what is the data telling me? And I think, as Martin mentioned, in terms of what a digital twin is—which is a digital representation of the machine—it really starts with that data.

So, I would just add to Martin and say, as we’re starting to see the rise of more and more data coming out of the factory, what we need to do is be able to intelligently understand that data before we apply it. And I think that’s the real key, is as I’m taking data off and I’m making some sort of assumption based upon that data, I’m analyzing that data, how do I analyze that data? Well, really that’s what a digital twin is, to some extent. It’s a way of representing the data as it’s coming out of the machine. Make some assessment of that and understand it before you go and apply an output or an outcome of that data.

So, I look at that as kind of like the source of where we’re coming from, and it really relates to AI and all of the things that back up what goes into a digital twin. So, again, I just add that little bit to what Martin said in terms of the digital twin, and I think that’s why it’s important. Now the other thing that Martin mentioned, and I think this is really key, is we often think of digital representations of data, really, at a machine level, or at a very small part. What we’re now starting to see is the evolution of these things coming together. So, as more and more data becomes available, more and more digital twins come available, what we want to do is we want to say, well nothing works in isolation, to some extent, everything works with somebody. We work together. Machines work together to create an outcome.

So, that’s really where you start to see supply chains and things coming in as well. And I think what we’re starting to see is the next evolution that comes even beyond that is really going right through your supply chain to your customer and really understanding: Do I have a digital twin of the person who’s buying from me? Do I have a digital representation of the person who’s investing in me? What is it that they need? I mean, Martin talked about a CEO view, a CTO view, an engineer on the shop floor. Well, I’m an investment person investing in that business. I want to see a digital twin to understand what’s coming out—that represents the value that I’m putting in through my investment. And I think that goes on now.

So, I talked a little bit about the future, but, again, bringing it back to where we’re at right now, digital twins really allow us to analyze and use that data in a way that makes sense for manufacturers to apply that data into their own particular environment. And I think, Martin covered that pretty well.

Christina Cardoza: So, you brought up a couple great points, Ricky, especially around the data—how it’s helping manage and collect all of this data. I know data management has been a big problem in manufacturing with all of the influx coming from these devices and having to manage it all. And so you also mentioned artificial intelligence and machine learning. While all of these are bringing great opportunities to the factory, they’re also bringing a lot of challenges too. So, Ricky, can you talk a little bit more about the challenges that come with these new digital technologies?

Ricky Watts: Well, we could be here for a long time on that one, Christina, but I'll try and summarize as best as I can. Manufacturers are not generally people that really understand AI and machine learning. So, I think that's one of the challenges. We do have a skills gap to some extent. Manufacturers make things. They make widgets or whatever comes out of their factory—cars, etc. So, when we're introducing these types of technologies and we're using this type of data, we're getting into a completely different skill set. Data scientists come in—that's a term you'll hear—so, how do I analyze that data? What are the algorithms that I need to apply to that data to make sense of it? And that kind of goes back to the digital twin, to some extent, which is a representation of that.

So, in a sense, that’s one of the key challenges for manufacturers today, is how do I implement something like that within the workforce that I’ve got today that makes sense? So there’s one challenge right away. The other thing, of course, is this is all relatively new. How do I trust that data? What does trust mean on that data, and how do I apply it? So, there are a number of challenges—security, all these things start to kind of evolve into this as you start to bring in more and more compute at the edge.

And I think those are some of the things that I know we’re certainly working on here in Intel, is to how do we simplify and make this easier for manufacturers to consume some of these technologies? So, building partnerships and ecosystems to bring in the infrastructure. What we are really looking for, very simplistically, is a plug-and-play type of approach. I want to be able to take something in relatively simply, put it in there, generate the data, collect the data, put it into a digital twin, and get some sort of outcome that’s relatively simple. I’ve said words there, Christina, that honestly there’s a lot that goes on behind the scenes to do that.

But I think those are some of the challenges that we have with things like AI and ML as well. And many manufacturers tell me, this all sounds great, but I don't know how to do it. So, it makes—it's really difficult. I mean, you imagine if you are a small manufacturer in India making something, you've got to adapt these things to the type of manufacturing environment as well. We're not all a super-large car manufacturer, as an example; there are small and medium businesses that represent a huge amount of what we would call the industrial footprint, and they do not have the definition and the scale. So, we're also bringing scale into those solutions. Scaling a digital twin for a car manufacturer versus scaling a digital twin for somebody who makes screws for a car manufacturer is a different thing.

So, again, scale, complexity—simplifying these things out and then making sure that we empower not just manufacturers, but we need to empower an ecosystem to be able to go out and service these things as well. We need people that can go out and install them, work with customers around tuning the models—as you mentioned—machine learning. You’ve got to create the models to do that.

So, there’s a whole ecosystem of opportunity here around this that we’ve got to promote and work with it. So, hope I’ve given you some of the clues. There’s many more challenges as well, but these challenges are being overcome as we work together with a very powerful ecosystem to do that. I’m sure Martin has some similar views to me on some of these things, and the way he sees them—he sees a lot of this stuff going on as well.

Christina Cardoza: Absolutely. And given all the challenges you mentioned, Ricky, and knowing that it is hard to get started, Martin, I'm wondering if you can talk to us about how to successfully implement digital twins in the factory—not only in the short term, but how do you make sure the short-term benefits you're getting will also carry into the long term, and that you can future-proof your investments today?

Martin Garner: Well, it’s a really good question Christina. And as I think just building on what Ricky has just said, I think it’s quite clear that digital twins—at least the full vision of digital twins—is quite a long-term project across both OT and IT. And it’s a really big part of digitalization of a factory’s operations, and so on. So, it’s really not a quick fix for anything.

And one of the other challenges is that the current economic climate is a bit—a bit not very happy at the moment. And so, in that climate it may mean that some companies hesitate to step into a bigger long-term project where they’re perhaps not comfortable, they don’t really know where they’re going with it, and they may feel that now is not the moment to make that step. But actually there are some very good uses of digital twins that have really good payback times, and I’d call out predictive maintenance and software testing as two of the key ones. I’m sure there are more.

And so, for anyone who’s interested in using digital twins for short-term gain, it can be done. And the trick is to make sure that you get a properly architected system, which is open enough to build up, as Ricky said, the ecosystem, and plug in other machines and expand towards the fuller vision, and then build that out progressively when you are ready to do that. But go for the short-term things first.

Ricky Watts: Yeah, and Christina, just to add to that, I think you've raised a really important point. I think the world is going through a number of challenges right now. I think we see them everywhere, depending where you are in the world—whether it's oil and gas industries and the challenges of the energy market; whether it's geopolitical issues; whatever it is—there are many issues that I think we're all seeing right now. And when I look at digital twins and this word and this concept around what we're doing, it's great, but you've got to get practical. What can I do with something today that's going to give me a benefit tomorrow—not next week, not the week after, not next year?

The world can be full of hype, we know, so there’s a lot of potential in everything that we can do. I see it, but we’re technologists. We see this huge potential. These manufacturers, they’re very much focused on how they’re going to survive, probably in a very tough fiscal environment for the next few years. So it really is around getting very tactical.

Predictive maintenance is a great one. Let’s take that and unpack that. If I’ve got a digital model which represents something of my machine, and my machine is making something in the factory—shoes for example, I don’t know, sticking things together—if I’ve got a digital twin and that digital twin is telling me my machine is going to fail, then I can go fix something before it fails. If I do that, I keep my factory operating. I’m producing my goods for longer periods of time, which makes me more competitive.
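To make that predictive-maintenance idea concrete, here is a minimal sketch of the kind of check a twin can run: compare live readings against the operating envelope the model expects, and flag the machine before it fails. The signal names and thresholds are illustrative assumptions, not values from any real deployment.

```python
# What the twin says "healthy" looks like for this (hypothetical) machine.
EXPECTED_ENVELOPE = {
    "vibration_mm_s": (0.0, 4.0),
    "bearing_temp_c": (20.0, 85.0),
}

def maintenance_alerts(readings: dict) -> list:
    """Return the signals that have drifted outside the twin's expected envelope."""
    alerts = []
    for signal, (low, high) in EXPECTED_ENVELOPE.items():
        value = readings.get(signal)
        if value is not None and not (low <= value <= high):
            alerts.append(f"{signal}={value} outside expected range {low}-{high}")
    return alerts

# Usage: plan an intervention before the real machine goes down
live = {"vibration_mm_s": 5.2, "bearing_temp_c": 78.0}
for alert in maintenance_alerts(live):
    print("Plan maintenance:", alert)
```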

So, I think as we look at digital twins and the breadth of digital twins, I think that’s great, great vision, great views, and we as industry people will continue to drive that, because we understand what that future represents as we do that. But right now, I think focusing around the needs of manufacturers today, try—“KISSes” it. They call it the KISS principle: “keep it silly stupid.” So—simple, stupid, something like that.

So, I think that's really interesting: let's make sure that we bring these things in, do it holistically, and focus on something that's going to add near-term value to a manufacturer. That has two benefits. One, it solves a problem for today. And the second, going back to what I said earlier, is it allows manufacturers themselves to start to learn as well and become skilled. But they do it in a simple way. We're not trying to introduce this huge idea to them that's very hard for them to consume. They don't have the skills. Here we say, here's a very small footprint; it's doing something in your factory, and you can tool yourself up. It gives you a near-term outcome to address some of your near-term challenges, but long term it also allows you to expose your workforce to the use of data in those environments.

So, I think, as Martin said, I think actually this is going to represent some opportunities in the near term that actually could benefit us in the mid- to long-term with the progress toward more expansive use of digital twins.

Christina Cardoza: I love that point you made about introducing this slowly to manufacturers so that they can build up their skills over time. And I know in the beginning you mentioned you have to make sure you have the right skills and the right people to make this possible. I think every industry, especially in manufacturing, is facing the global skills shortage—trying to find people to do this, people with the right skills to do this. So, Ricky, can you expand a little bit on that? What are the skills and the people that manufacturers need for this, and how do you slowly introduce this to them so that they can start getting those skills?

Ricky Watts: Yeah, I’m going to answer that in two ways, Christina, as well. So, you know, we mentioned some of the skills that—what we call data-driven skills, data scientist skills. But I think we recognize as an industry that that’s almost an impossible task to some extent. You can’t create data scientists en masse. So, what can I do to effectively turn maybe a process-control engineer that’s working in an oil and gas or a chemical plant, how can I turn him into a data scientist without having to do 10 years of training and repurposing him and doing—creating all of that tooling?

So, we are creating tools and capabilities in the background so that actually what we can do is almost in a way repurpose the skills of that process-control engineer and say, “All you need to do is tell me what’s good and what’s bad in something, okay? Use your skills that you have to tell me what’s going on.” And then we can apply that to the data models themselves and the digital twin models in a very simplistic way.

So, in a sense I think skills are going to be needed; you’ve got to get more orientated around IT and OT skills from a compute perspective: How do you install compute? How do you look after and you work closely with your IT departments? But let’s not lose the benefit of what those process-control engineers or those engineers are doing. They know the outcome. They know when something’s wrong in their manufacturing. They have those skills in their brains. So, what we want to do is be able to take that out and turn that into a digital model without having to create them as a data scientist, so we’re creating tools and capabilities to be able to do that.

So in a sense we need them to retool themselves around obviously modern architecture and modern ways of doing things, but actually at the same time we are going to give them the tools to use the skills that they have today so that they can apply that in a much more simplistic fashion, so that I don’t have to effectively retrain this person for 10 years. I can just say, “Look, go onto this terminal in your manufacturing environment, identify something good, identify something bad, or tell me when you hear this noise, or when you do this something’s going to go wrong with that machine.” We can translate that into compute code that sits inside the digital twins and our models and our AI, that then recognizes that as an issue. And then, of course, once you give the machine the data or clue, it can then go in and start doing it.

So, you’ve got two angles to that. One is obviously keep reskilling people in terms of the compute skills that they’re going to need. But at the same time how do we translate the skills that they have and make that easy to consume for the models? So, there’s two areas that we’re doing that and actually being very effective. You’re going to see a lot of progress, particularly on the second one, where what we really want to do is effectively repurpose the skills that they have, but be able to bring them out and create those digital models that we’re looking for.
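As a rough illustration of that "tell me what's good and what's bad" idea, here is a minimal sketch in which an engineer's labels train a simple model to recognize the same conditions. The features, labels, and use of scikit-learn are assumptions for illustration only, not a description of Intel's tooling.

```python
from sklearn.tree import DecisionTreeClassifier

# Each row: [vibration_mm_s, bearing_temp_c]; the labels come from the
# process-control engineer, not from a data scientist.
readings = [[1.2, 60.0], [1.5, 64.0], [5.8, 92.0], [6.1, 95.0]]
labels = ["good", "good", "bad", "bad"]

# Train a small model on the engineer's judgments.
model = DecisionTreeClassifier().fit(readings, labels)

# The model can now flag new readings the same way the engineer would.
print(model.predict([[5.5, 90.0]]))   # -> ['bad']
```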

Christina Cardoza: So, I'm interested in learning more about the tools and technologies we just talked about that are making this all possible. We mentioned a lot of new digital tools in the beginning, bringing new opportunities to manufacturers as well as challenges. And I think we as people just have a tendency to jump on the latest and greatest new technologies that come out. But, Martin, I'm wondering, what do you think is really necessary to make this possible? How do you leverage the existing infrastructure and technologies you have, and what really are the new tools and technologies you need to be implementing?

Martin Garner: Well, yeah, and so Ricky has talked about a good number of them already, especially the use of the data, the analytics, and the machine learning heading towards AI. I think there’s a couple I’d just add in to that part of the discussion. So, I know from doing research in this area that one of the bits people find hardest is just getting the data feeds organized and set up in a way so that the data is compatible. And it’s because different sensors and different machines present their data in a whole variety of different ways. Even the same manufacturer presents data in a variety of different ways because they weren’t expected to have to be compatible, and silly things like time stamps become amazingly challenging to organize properly.

And that's really why it's better to start small and simple, get the hang of something, get some value from it in one small area, and then progressively build it out from there. The other bit I would just add is that factories are complicated things. They have a whole range of different technologies and machines, and those technologies sit at all levels of the technology stack—from the connectivity all the way up to the AI. More than that, they have a unique mix of machines, so it's very hard to do any kind of templating. And what that means is that for a larger system we may well need quite a lot of systems-integration work to really get the value out of it. So there is quite a lot there. I think you can start small and start getting value, but it quickly becomes a bigger project as you look to scale it up.
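To illustrate why those data feeds are hard to organize, here is a minimal sketch of normalizing two machines' readings, reported with different field names, units, and timestamp formats, into one common record. The formats and field names shown are illustrative assumptions.

```python
from datetime import datetime, timezone

def normalize(record: dict) -> dict:
    """Map a vendor-specific reading onto a common schema with UTC timestamps."""
    if "ts_epoch" in record:                      # machine A: epoch seconds, Celsius
        when = datetime.fromtimestamp(record["ts_epoch"], tz=timezone.utc)
        temp_c = record["temp_c"]
    else:                                         # machine B: ISO string, Fahrenheit
        when = datetime.fromisoformat(record["timestamp"]).astimezone(timezone.utc)
        temp_c = (record["temp_f"] - 32) * 5 / 9
    return {"timestamp": when.isoformat(), "temperature_c": round(temp_c, 2)}

# The same physical moment and temperature, reported two different ways:
print(normalize({"ts_epoch": 1700000000, "temp_c": 71.5}))
print(normalize({"timestamp": "2023-11-14T22:13:20+00:00", "temp_f": 160.7}))
```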

Ricky Watts: We've also got to look at how we are creating uniformity around the machines. We've had what we would call appliances in manufacturing—so, vertical machines that do certain things. One of the things that we are doing here is working with the industry to create abstractions. So, effectively what we're trying to do is create uniformity through standards that exist inside the factory: using a universal language such as OPC UA so the machines can talk to each other and every machine understands every other machine to some extent.

It's a big job. We create a data model of each machine that exists so that we can understand and categorize the way they talk together as well. And that's really about some of the abstractions, and you're starting to see the industry move forward—particularly in the process industries—where they want to create this uniformity because they recognize some of those challenges.

Martin Garner: And the great thing about doing that is it turns it from a sort of stove-piped, vertical, proprietary thing into much more of a sort of horizontal platform approach, which can be—is much, much better for building out scale across manufacturers, supply chains, different sectors, and so on. It’s just an all-around better approach isn’t it? If we can get there.
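For a sense of what that universal language looks like in practice, here is a minimal sketch of reading one value from a machine's OPC UA server. It assumes the open-source python-opcua client package; the endpoint URL and node id are placeholders, not real addresses.

```python
from opcua import Client

client = Client("opc.tcp://192.168.0.10:4840")   # hypothetical machine endpoint
client.connect()
try:
    # A node id published by the machine's OPC UA information model,
    # e.g. a spindle temperature exposed under the vendor's namespace.
    temperature_node = client.get_node("ns=2;s=Machine1.SpindleTemperature")
    print("Spindle temperature:", temperature_node.get_value())
finally:
    client.disconnect()
```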

Christina Cardoza: Absolutely. And you mentioned OPC UA. We’ll be talking to the OPC Foundation on a later podcast on this topic, getting industrial communications and these devices to talk together. But I know Intel has been a huge leader in making some of this possible—democratizing the tools and the technologies for domain users and ensuring that machines and devices can talk to each other. So, Ricky, can you talk a little bit about the Intel efforts on your side: how the company has been working to make digital twins successful along with some of those partners you just mentioned?

 Ricky Watts: Yeah, no, and thanks, Christina, that’s a good question. So, you mentioned—first thing is look—what Intel does extremely well, in addition to obviously building those wonderful compute platforms that run these things at the edge of the network, is we look at scale and creating standards. So, one of the things that Intel’s doing particularly within our industrial business unit is—you mentioned OPC UA. So, we’ve been part of those foundational efforts working with the industry partners to create coalitions of people to identify how do you create this standard?

We’ve been doing that in the oil and gas industry around what they call the OPAF, the Open Process Automation Forum. So, this is creating that abstraction, that horizontalizing that Martin talked about. And then, what are the things that we need to do within OPC UA, that language. A language is only good if it allows you to communicate in the ways that machine wants to communicate.

One of the areas that's very important in manufacturing is real time. We call that time-sensitive networking, or TSN. Now, in legacy environments we've had legacy protocols that do that. Well, those legacy protocols really are isolated. So, a machine becomes isolated to create that type of real-time environment. It can't interwork with some of the other machines because of the nature of that machine. Well, we've been working with OPC UA to extend it to cover real time, so that you can do real-time communications in a single network environment. You're not going to lose your real-time capabilities. The language supports that.

So, what has Intel been doing? It's been looking at the compute platforms, making sure that we're bringing through the technologies that are going to be needed for those manufacturers. So we've been adding things such as TSN and time-coordinated compute. Go back to something Martin talked about, time stamping. Well, we need the compute platforms to be synced on atomic clocks so that data on one platform is linked to data on another platform, and the two are completely in sync. So that's time-coordinated compute, functional safety—many other things that we've been doing in our platforms to bring through those capabilities.

As we bring them through, we're enabling the software ecosystem to be able to use those capabilities, okay? So, making sure we validate with Linux operating systems, with Windows operating systems, with virtual machines, with Kubernetes—all these wonderful things that you hear, which are basically software infrastructure layers that allow us to run the applications and services from companies like Siemens. So, the PLCs, the soft PLCs, the SCADA systems—all these wonderful capabilities. So that's that abstraction: the compute at the bottom with all of the functional safety and security components, and enabling that ecosystem in the middle with the software vendors so that they can pick up that capability.

And then working with what I would call the application, or the OEM vendors, so that they can run an application and they’ve got the ability to identify through APIs, through the capabilities, through the calls that they can actually run those applications with what they want, and we can take the data off. So we’ve been: compute platforms, scaling with ISVs, creating what we would call working teams, and creating industry forums to create scale, building standards, applying those standards, and then of course working very much with the end-user community to make sure that we’re not creating Frankenstein’s monster.

Christina Cardoza: Well, I'm excited to see what else Intel comes out with and how you continue to work with your partners. Unfortunately we are running out of time, and I know there are tons of different areas and topics in this one space that we could touch upon that we just don't have enough time for today. But before we go, I just want to throw it back to each of you for any final key thoughts or takeaways you want to leave our listeners with today. So, Martin, I'll start with you.

Martin Garner: Yeah, sure. And I think Ricky mentioned already the full vision of digital twins, and we have seen from some people sort of planetary-scale weather and geological systems which help us understand global warming and things like that better. Against that, there is a huge number of smaller companies who really don’t know where to start. And I think we’ve already touched on this—we need to make it easier for them and obviously worthwhile to invest in this. And that means really focusing on the shorter term: how to save money using digital twins in the next quarter, and also how to make them easier to buy and set up. Could it be just a tick box on the equipment order form, could it be as easy as that? And then a little bit of configuration when you get the machine, and it all arrives. The vision is one thing, but we need to pull along the mass market of people who might use them as well. And we can’t just do one or the other. We need to do both.

Ricky Watts: And I'm going to add a few things to what Martin said as well. I think he's absolutely spot on. My final thought is, yeah, there are huge amounts of things that you can do going forward, but let's focus on the things that you need to do in the near term, and really don't hesitate to reach out and start your own journey. But keep it small, keep it simple. And we are working around those things such as predictive maintenance. So, get into this, start to understand what you can get benefits from, but keep it small, keep it contained. And we do have solutions that are available to start you on your journey, but really very much focused on what your problem is today.

Christina Cardoza: Great points, and it’s been a pleasure talking to both of you today. Thanks so much for joining me on the podcast, and thanks to our listeners for tuning in. If you like this episode, please like, subscribe, rate, review, all of the above on your favorite streaming platform. Until next time, this has been the IoT Chat.

The preceding transcript is provided to ensure accessibility and is intended to accurately capture an informal conversation. The transcript may contain improper uses of trademarked terms and as such should not be used for any other purposes. For more information, please see the Intel® trademark information.

This transcript was edited by Erin Noble, copy editor.

About the Author

Christina Cardoza is an Editorial Director for insight.tech. Previously, she was the News Editor of the software development magazine SD Times and IT operations online publication ITOps Times. She received her bachelor’s degree in journalism from Stony Brook University, and has been writing about software development and technology throughout her entire career.
