Hi everyone, my name is Ivan and I'm a Senior Solutions Engineer at Edge Impulse. In this talk there are going to be a lot of buzzwords that we all like, together in one place: AI, IoT, LLMs, foundation models, we have it all. I think it's very exciting to look at how we can combine all of this to drive more productivity and growth. And that's another intro
slide. So I'll start with a question; there are going to be quite a few questions throughout this presentation. The first question is: what is IoT? I think people who have been in the industry for 20 years still have different answers to this, because we just cannot pin it down; it's so many things at the same time, and everyone has their own answer. Is it a fridge with a door that can tell you you're out of beer, but also a back door that will send your data out? I don't know. From an engineering standpoint, I've formulated it for myself, at least, as sensing, intelligence, and connectivity, which you can then apply to different things. We sorted out the sensing part a long time ago: we have plenty of ways to capture sensor signals of all kinds from the world around us. We've sorted out connectivity too; honestly, there's a huge market and competition in technologies, including low-power technologies like LoRa and 4G, so that part is pick and place. But the intelligence part is something that has been progressing along with the kind of compute capacity available on IoT devices, and I'll talk about that from the hardware standpoint later. I think the focus for us now is adding more value in the intelligence
part. The next question: what is the most important graph in the tech business? It's not this one, not this one; it's this one, the Gartner hype cycle. You can find a hype cycle for a lot of different industries and topics; this particular one is the hype cycle for artificial intelligence. For those of you who don't know it, the hype cycle tells you at which stage of adoption, and how close to productivity, different technologies are, and how soon they're expected to reach the desirable plateau of productivity. That's where computer vision is right now, which means we can take it, apply it, and get value immediately; it's well understood, and people know how to use it and why they want to use it. But there's an interesting edge AI entry, which I would argue goes hand in hand with computer vision, and that's what we're going to focus on today. You can also notice that while it's at an early stage, it's expected to reach the plateau in less than two years; this technology is evolving rapidly, and we're seeing a lot of innovation
happening. So, another question (I have a lot of questions in this presentation): what does the real world look like? If we look around, we have light, sound, smell, all sorts of signals; that's the sensing part we've already figured out. And what do we do as humans? I can smell something and tell whether it's a burger or pasta; I can look at a light and say whether it's red or green. Our brain takes a signal that a sensor would normally capture, makes an inference based on it, and determines what it is: a classification, a regression, something like that. Historically we've been trying to reproduce this on all sorts of devices, including IoT, and that's what we call heuristic programming: we watched the data come in, we knew what problem we wanted to solve, we built rules around it, and then we expected the system to tell us what's going on based on those
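A minimal sketch of what such rule-based, heuristic logic looks like in practice; the sensor names and thresholds here are illustrative, not from the talk:

```python
# Heuristic approach: an engineer hand-writes the rules that map raw
# sensor readings to an outcome. Every edge case must be anticipated
# up front; thresholds below are purely illustrative.

def classify_reading(temperature_c: float, vibration_g: float) -> str:
    """Hand-coded rules mapping sensor values to an alert label."""
    if temperature_c > 80.0:
        return "overheat"
    if vibration_g > 2.5:
        return "excess-vibration"
    return "ok"

print(classify_reading(85.0, 0.1))  # overheat
print(classify_reading(20.0, 0.1))  # ok
```

The weakness is exactly what the talk describes next: anything the engineer did not anticipate falls silently into the "ok" branch.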
rules. But the new approach, AI at the edge, is when you don't set the rules manually; instead you give the algorithm all the data you have for all the classes of the problem you want to solve, and you let it figure things out on its own. This is far more efficient. Keep two things in mind, though: data is paramount in both cases, and IoT is all about data. We get a lot of it, from all kinds of sources, but what do we do with it?
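To contrast with the rule-based version above: the data-driven approach hands labeled examples to an algorithm and lets it find the boundary itself. A toy nearest-centroid classifier stands in here for a real training pipeline; the feature values and labels are made up for illustration:

```python
# Data-driven approach: instead of writing thresholds, we give the
# algorithm labeled examples and let it place the decision boundary.
# A nearest-centroid classifier is a minimal stand-in for training.

def train(samples):
    """samples: list of (feature_vector, label); returns per-label centroids."""
    grouped = {}
    for vec, label in samples:
        grouped.setdefault(label, []).append(vec)
    return {
        label: [sum(col) / len(vecs) for col in zip(*vecs)]
        for label, vecs in grouped.items()
    }

def predict(centroids, vec):
    """Return the label whose centroid is closest to vec."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist(centroids[label], vec))

model = train([([20.0, 0.1], "ok"), ([21.0, 0.2], "ok"),
               ([85.0, 0.1], "overheat"), ([90.0, 0.3], "overheat")])
print(predict(model, [88.0, 0.2]))  # overheat
```

No rule for "overheat" was ever written; the boundary came from the labeled data, which is why, as the talk stresses, the data itself is paramount.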
Right, so why is the heuristic approach lacking, and why are people working with it starting to see more and more limitations? Because the real world is fuzzy. We can say: okay, I can figure out these two edge cases that I've seen on my production line, and I'll set rules for them. But what if something I didn't know about happens? How do I detect the thing that I'll have to fail on first and only then learn about? We cannot account for everything. Heuristics are also labor-intensive: collecting and modeling all the situations we want to anticipate requires a lot of domain knowledge and, simply, time. And insights get left on the table. With IoT data we might say we only want to detect when a temperature crosses a threshold, yet in the same data there are potentially many more problems I could extract and solve with exactly the same inputs; we just don't know how to approach them, because they're a bit more complex than a threshold.
Right. So, this QR code you'll see a couple of times; my challenge is that everyone has to scan at least one during the presentation and sign up. This one lets you register on Edge Impulse, a platform we provide that lets you build a machine learning pipeline and deploy it on an edge device. The result is an algorithm that takes in sensor data, any sensor data you like, for any problem you'd like to solve. You build it in the cloud, of course, since training models requires a lot of compute, but after that you can deploy it to any sort of hardware. From a high level, that's what it looks like, and it's a normal engineering process: an iterative one. The platform lets you do machine learning engineering with embedded systems in mind, iteratively and quickly, without having to pay attention to the things that are very hard to
do. The second question we always get is: where can I deploy machine learning with this platform? I made this chart to show that you can deploy on a PC, an MCU, an NPU, an MPU, whatever you like. As our CTO likes to say, basically anything: if you can compile C++ code for it, this can run there. And here's another overview of the sort of hardware I'm talking about: any hardware. Of course, on smaller hardware you'll tackle different problems; you won't run a computer vision model on a microcontroller unit, but all sorts of sensor data can have machine learning applied to them and value extracted from them.
Right. So I think what I'm trying to say with this is that AI and IoT are having a love story, because it's a perfect match. There are a lot of reasons, and I've had quite a few conversations about this today, but the most important one, I think, is that you save on energy costs and on bandwidth, the two most expensive things in IoT in my opinion. We have to pay for ingestion to the cloud: if you stream raw data there from a huge deployment of sensors, that's a huge cost. Energy is expensive too: battery-powered devices where you have to replace the battery every couple of years cost a lot to operate. So we run the models directly on the edge instead of having a camera stream all its frames to the cloud to run a big model there, which would mean sending 30 frames of 720p per second up, an enormous amount of data.
I think the companion point to that is that AI lets you extend what you thought was possible with IoT. Before, a camera like this one, with an NVIDIA GPU inside, you could only imagine connected directly over Ethernet to your local network. What you can do now is connect it via a LoRa backhaul, for example. LoRa doesn't let you stream images; it lets you send one short message of a couple of hundred bytes at most. So you do all the processing on the camera and send a single yes-or-no message over LoRa, and now you can use camera systems in your low-power IoT setups.
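The "do the inference on the camera, send one tiny message over LoRa" idea boils down to packing the verdict into a few bytes. This sketch only builds the payload; the on-device model and the radio driver are assumed to exist elsewhere, and the two-byte layout is my own illustration, not a standard:

```python
# Instead of streaming 720p frames, the camera runs inference locally
# and transmits only the verdict. Payload layout (illustrative):
#   byte 0: detection flag (0 or 1)
#   byte 1: confidence scaled to 0..255

def build_lora_payload(detection: bool, confidence: float) -> bytes:
    """Pack an on-device inference result into 2 bytes for a LoRa uplink."""
    return bytes([1 if detection else 0, int(confidence * 255)])

payload = build_lora_payload(True, 0.92)
print(len(payload))  # 2
```

Two bytes per event versus tens of megabits per second of video is the whole bandwidth argument in miniature.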
But let's take a little step back and return to the point shared by all these approaches and all the engineering we try to do with sensors: everything starts with data. We don't have a use case, or anything at all, without data.

GenAI. This has been a big thing over the last several years. I think it's crazy that I can now write a prompt like this and get this slightly cartoonish image out, one that still carries the message; and other models, given the same prompt, can generate a photorealistic image that genuinely resembles reality. So we can use GenAI capabilities to augment the datasets we have, because collecting data, and finding ways to collect it, is often a very hard task.
So what we've added to the platform (I'm going to show time-series data now, but with images we have the same) is the ability to use these big models not to do your job, not to solve your problem, but to help you build a robust solution. For example, this model from ElevenLabs can generate keywords. Let's see how it works in the platform: I type in the phrase I want, and it generates several samples. "TechEx London 2025." "TechEx London 2025." "TechEx London 2025." So now I can generate a keyword-spotting dataset without ever touching a microphone; that's pretty amazing, and these models are already very good. Another one: imagine I want to build a glass-break detector. I can generate those sounds without having to break glass; I don't want to break anything, I just want a model that detects it for me. That's pretty great, right? It's incredible that we can generate very large amounts of data, very fast, with minimal investment.
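One way to think about such a synthetic keyword dataset is as a grid of variations of the target phrase. The sketch below only plans the clips; the actual call to a TTS model (an ElevenLabs-style API, for example) is hypothetical and left out, since its interface is not something this talk specifies:

```python
# Plan a synthetic keyword-spotting dataset: one clip per combination
# of voice and speaking speed. A real pipeline would pass each job to
# a TTS API; here we only enumerate the jobs.

def plan_dataset(phrase, voices, speeds):
    """Return one generation job per (voice, speed) variation of the phrase."""
    return [{"phrase": phrase, "voice": v, "speed": s}
            for v in voices for s in speeds]

jobs = plan_dataset("TechEx London 2025", ["voice_a", "voice_b"], [0.9, 1.0, 1.1])
print(len(jobs))  # 6
```

Varying voice and speed is what gives the keyword-spotting model the acoustic diversity it would otherwise only get from many human speakers.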
But actually, everything starts with labeled data: if we have a bunch of data collected from the field and it isn't labeled, it isn't quality data, and it will do us damage rather than help us build a solution. So, quality control; as we say, garbage in, garbage out.
So again, LLMs: it's crazy that I can now give an image to ChatGPT, in this case, or any other LLM, and ask it what's going on. It won't just tell me there's a person wearing orange; it will tell me the photo was taken at the Olympics, and that this is the Dutch national rowing team receiving a medal, which is already a step closer to any sort of automated process I might build. That's great, but it's not directly applicable to IoT use cases, because it's very expensive. These models are huge; they won't fit on a device, they're expensive even to run, and, going back to my earlier point, you'd have to send all those images up, so the costs grow very quickly. That's not really feasible.
What we can do, again, is use these models to help us quality-control the data we already have. If I'm building a use case around a lot of images, I'll go and record a video that contains the things of interest, but then I'd have to label every frame of that video manually, which is very labor-intensive. What I can do instead is hand my whole dataset to a large language model (this is how it looks in our platform) and ask it: look at every one of these several thousand images and tell me whether there's a hard hat in it or not. In about five minutes I have thousands of labels applied. That's pretty incredible, and it transforms the productivity with which we can approach these
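LLM-assisted labeling reduces to asking one constrained yes/no question per frame. In this sketch `ask_llm` is a stub standing in for a real vision-model API call; the prompt, file names, and label names are all illustrative:

```python
# Auto-labeling with a vision LLM: one constrained yes/no question per
# image, mapped to a dataset label. ask_llm is a stub; a real version
# would send the image plus the prompt to a vision-model endpoint.

def ask_llm(image_name: str) -> str:
    # Stub answer. A real call would use a prompt like:
    # "Is there a hard hat in this image? Answer yes or no."
    return "yes" if "hardhat" in image_name else "no"

def auto_label(image_names):
    """Label every image by asking the model one yes/no question."""
    return {name: ("hard_hat" if ask_llm(name) == "yes" else "no_hard_hat")
            for name in image_names}

labels = auto_label(["site_hardhat_001.jpg", "office_002.jpg"])
print(labels["site_hardhat_001.jpg"])  # hard_hat
```

Constraining the model to a yes/no answer is the design choice that makes thousands of labels cheap and machine-checkable, rather than free-form descriptions you would have to parse.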
solutions. So what we have so far: foundation models, big models, LLMs, which are incredibly smart and very general, and which can assist us in building solutions for the edge. Those edge solutions don't have to be that smart, because I'm building a model to solve one particular problem, at my plant or in my product; I don't need it to know everything in the world. But sometimes that's just not enough. So how else can we be smart and combine them? We need to take another step back and recognize that everything starts with hardware; IoT is hardware, in the end. On the slide I showed earlier, IoT was sensing, intelligence, and connectivity. Previously we developed the technologies on top, like connectivity, while the hardware stayed limited: we could only run control logic there, sensing and sending everything up. But now the hardware has caught up. We have so many ways to put extra intelligence into the same chips we already use, and to extract value from the sensors we're sampling.
The way we like to describe edge AI hardware is as 3+1 layers: 3 plus 1 because the first three are still IoT, while the GPU sits somewhere in between (it can be an embedded GPU). The layers are: MCUs; MCUs with NPUs, neural processing units, which are dedicated ASICs designed for machine learning inference; MPUs, which are typically Linux systems; and then GPUs and AI accelerators, the high-end edge ML hardware. Let's look at what happened in 2022. A little company called Arm developed something called Ethos. Arm is known for licensing general-purpose computing cores, but here they asked: how do we make a "GPU", in quotes, for embedded systems? So they built a very energy-efficient, highly performant piece of hardware that can process machine learning operations very quickly. And because Arm did it, everyone who works with Arm can take it and apply it to their designs. Incredible. Now we have a number of companies, Alif among them, with solutions built on SoCs that carry a dedicated NPU licensed from Arm.
But then companies like ST looked at this and said: I want to do it myself, and I can do it better than licensing it from Arm. You see that yellow corner; ST announced this a couple of months ago. They developed the STM32N6 (you can see this kit at our booth, 152), with their own silicon accelerator delivering around 600 GOPS, which means that on an MCU-grade device, with MCU-grade power consumption, you can run YOLOv8 at 15 FPS. Incredible. Second, mid-range computer vision: here you have the likes of NXP, who also used Arm's dedicated acceleration, and Renesas, who build their own IP. Amazing. On these you can already connect several cameras, run a Linux system, and do far more complex things. And then there are high-end computer vision systems, where you can run ten models at the same time, across ten camera streams at 60 FPS, and put it all into robotics applications and the like. With this hardware you can grab everything and run locally things you could never have imagined before.
So we can be even a bit smarter and say: how about I run a very simple model, anomaly detection, that answers one question: is there something I don't recognize in the frame, yes or no? Is there something on the floor that isn't supposed to be there? That model only says yes or no, but as soon as it fires, only then do I send the image to GPT, or, on the top tier of devices, to an on-device LLM or VLM, and get an insight into what's actually happening. I see this kind of solution becoming far more prominent and interesting in the coming year, because the hardware now allows it; it's my personal highlight for the coming
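The cascade pattern can be sketched in a few lines: a cheap on-device anomaly score decides whether the expensive cloud model is invoked at all. Both models here are stubs standing in for real ones, and the threshold is illustrative:

```python
# Cascade: run the cheap anomaly detector on every frame; call the
# expensive LLM/VLM only when the score crosses a threshold. Both
# models are stubs for illustration.

def anomaly_score(frame) -> float:
    """Stand-in for the small on-device anomaly detection model."""
    return frame.get("score", 0.0)

def describe_with_llm(frame) -> str:
    """Stand-in for the expensive cloud (or on-device) LLM/VLM call."""
    return "anomaly description"

def cascade(frame, threshold=0.6):
    """Gate the expensive call behind the cheap one."""
    if anomaly_score(frame) < threshold:
        return None          # nothing sent upstream, zero cloud cost
    return describe_with_llm(frame)

print(cascade({"score": 0.2}))  # None
print(cascade({"score": 0.9}))  # anomaly description
```

The economics follow directly: if anomalies are rare, the expensive model runs on a tiny fraction of frames, which is exactly the balance the demo below shows.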
year. And here's a small demo of it in action. There's an anomaly detection model running, and you can see the cascade is not enabled, so we're not sending anything upstream, nothing to ChatGPT. Now I'm sending only these frames; if I did this for every frame, I'd be spending all that money I was talking about. Now nothing is anomalous, so I'm not sending anything; it's running purely on the device. And now, a signature prop: a bottle of beer. Beer bottles, yes. So I can balance things: if I know a given event won't happen very often, I can still offload it and use the bigger, computationally expensive models.
Right, so just to reiterate: I think all these points say that now is the time when hardware and software are at a stage where, within the same solutions, distributed low-power IoT devices can get far more intelligence out of the data we're already collecting.

Last question (I had a lot of them): what makes these technologies take off? I like showing this, because we're all developers in one way or another. We all know the situation where we, as managers, come to a developer and say "I want this"; you need a developer who will take it, implement it, test it, and do everything required. That's why we at Edge Impulse, the platform you've seen before, have made it so you can access it for free, use most of the features for free, and even use the outputs of the platform, such as models and use cases, in commercial applications for free. Because if this technology gets adopted by everyone, and everyone speaks the same language and knows which hardware to use and which use cases to build, it makes us all far more productive and able to come up with incredible new things that can help
us. So, another QR code for those who missed the last two; this is your last chance (well, not actually the last one), but sign up: you get immediate platform access and can try all of this. And talk to us, talk to us at our booth. One more thing: on the left is an actual book, and we have a few copies left at the booth. It's a book about edge AI, about a lot of concepts that are beyond the scope of this conversation, but it gives you a fundamental understanding of how sensor applications are approached when you want to apply machine learning to them. It's a genuinely interesting read; it isn't marketing about how to use Edge Impulse, it's really about the fundamental principles of these technologies. There's also a lot of other content we've created about these topics. That's all for me.