0:02 Over the next 40 minutes, I'm going to present everything you need to know about Microsoft Fabric, Microsoft's new end-to-end data and analytics platform. The key learning outcome for this course is that you walk away with a solid understanding of what Fabric is, the problems it solves, and how it solves them. But beyond the what, why, and how of Fabric, what I really want to share with you is some of my passion and excitement for Microsoft Fabric, so you leave with an appreciation for the huge opportunity that Fabric represents for your organization, and for you personally in your own career.
0:41 We'll begin by telling the story of Houston Electric, a fictional electrical-goods e-commerce company that is struggling with its existing data infrastructure and workflows. Through this case study, we'll explore the problems that Fabric was built to solve. We'll then dive into the big vision behind Fabric: I'll show you exactly how Microsoft Fabric solves these typical industry problems, and you'll begin to see why I believe Fabric is such a transformational technology.
1:07 Now, if you want to truly understand Fabric, there are some fundamental concepts you'll need to learn. So in part two of the course we'll cover what I believe are the most important concepts to understand, those being: OneLake, the seven experiences of Fabric, and the four compute engines. We'll then look in more detail at the seven experiences of Fabric, so that you gain a proper appreciation for what exists and how you can use it. To round off the course, I'll show you how to get started with Fabric yourself, and signpost some of my favorite resources for learning Fabric that I think can really help you on your own journey.
1:45 So if you're wondering, "Is this course really for me? Should I invest the time to sit through it?", well, the great thing about Fabric is that it's an end-to-end, unified solution, which means it caters for every data persona in your organization. If you're currently working in any of these roles, maybe outside of Fabric, then this presentation will be highly relevant and interesting for you; and if you want to be working in one of these roles within Fabric in the future, then this course will also be a great foundation for you. Either way, no knowledge of Fabric is assumed. There won't be any code examples or overly technical, in-the-weeds material; this is quite a high-level course. If you want more detailed explanations and code examples, feel free to look at my other videos on YouTube, where there's lots of that. And I'll be using plenty of practical examples throughout to solidify your understanding of the platform.
2:39 I've spent over 200 hours preparing this course, so I believe it will definitely be worth your while to stick around. On top of that, I've been using Fabric nearly every day since it was released, so I've got a good understanding of how it works, and I've been a data professional for the last eight-plus years, working across Power BI, data engineering, data science, data warehousing, and real-time analytics, which, as you're about to see, covers nearly every aspect of Fabric's functionality. All that effort, knowledge, and experience has been condensed into this introduction-to-Fabric course, which I'm releasing for free on this YouTube channel to give back to the community that has helped me so much in my own journey. As well as on YouTube, this course is hosted for free in my online community, built especially for people like you to learn Microsoft Fabric even faster. I recommend you follow the link in the description, because as well as the video lessons you also get access to the course notes and links to the further resources I'll be mentioning throughout the course, and you can use the community features to ask me, and the community, any questions you might have about the course and about Microsoft Fabric generally. It's completely free, so there's nothing to lose. So, without further ado, let's begin.
3:51 To understand why Fabric is such a transformative technology, it helps to properly understand the problems it was built to solve, and to illustrate this I want to tell you the story of Houston Electric. Houston Electric is a fictional online retailer of electrical goods, based in the US but shipping electrical parts around the world. The company has grown to over 400 employees to service growing demand, and it prides itself on being innovative and using data to make better decisions. In 2015, Houston Electric hired its first Chief Data Officer, who launched a digital transformation program to make better use of technology to drive business growth. Central to this strategy was the establishment of a central data department, now with more than 50 employees, and the company has invested in a number of cloud technologies on Azure and Amazon Web Services and has started using Power BI.
4:46 Looking at what the company uses today, we can see that it's split into the following main departments, at least from a technical point of view. Each department has a number of data technologies that it uses to store, manage, and analyze its data, and if you're not familiar with all of the logos on screen, don't worry: the point is that the data landscape in the company has grown organically over the last eight years, with little strategic vision. This is a common situation that a lot of companies find themselves in. Each department uses the tools it prefers, and each department relies on data from other departments to do its job. For example, the customer success team manages the order-reviews database, which keeps track of all the reviews that customers leave about the company's products.
5:30 Now let's look at a particular workflow that uses this order-reviews database, to see how it works and how the company functions. When a customer purchases a product, they leave a review on the website, and the review data is stored in an Azure SQL database managed by the customer success team. The data engineering team have built a data pipeline that copies this data every morning into a data lake for the data scientists to analyze. The data scientists then take this review data and perform sentiment analysis on it, basically to gauge whether customers like or hate certain products, based on the text they write, and these sentiments are then written into a data lake container.
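To give a flavor of what that sentiment-analysis step involves, here's a minimal Python sketch. The keyword lists and scoring rule are invented for illustration; a real team would use a proper NLP library or service rather than a keyword count.

```python
import re

# Toy sentiment scorer: counts positive vs negative keywords in a review.
# The word lists and the scoring rule are illustrative assumptions, not
# the method a real data science team would ship.
POSITIVE = {"great", "love", "excellent", "reliable", "fast"}
NEGATIVE = {"broken", "hate", "terrible", "slow", "faulty"}

def sentiment(review: str) -> str:
    words = re.findall(r"[a-z']+", review.lower())  # strip punctuation
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

reviews = [
    "Love this charger, fast and reliable",
    "Arrived broken, terrible packaging",
]
print([sentiment(r) for r in reviews])  # → ['positive', 'negative']
```

The output of a step like this is what gets written to the data lake container for the BI team to report on.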
6:12 Finally, the BI team have created a Power BI report that communicates these findings back to the product teams, to help them improve the products going forward.
6:23 Now, they've got this workflow to a point where it's working, but it was a pain to set up and it's a pain to maintain, and the data for just this one workflow is scattered across four different locations in four different data formats, some of which are proprietary, meaning you can't easily access the data from other tools. And this is just one workflow: the company operates many data processing pipelines like this, copying data between different departments and between different storage locations and products. The complexity of the data landscape has grown so much that maintaining the existing systems and processes has become an almost full-time job.
7:01 So, to analyze what might be going wrong here, let's map out some of the tools the company is currently using under four broad buckets: data ingestion, data storage, data engineering and data science, and business intelligence. Now let's analyze the current situation against the nine considerations listed down the left-hand side, and to do that we're going to speak to a number of key employees in the company. If you speak to the Chief Data Officer, he'll tell you that the data architecture has grown organically and there are now data silos all over the place: "I'm the Chief Data Officer, and I don't want to be the chief integration officer." Instead of focusing his time on how the company can generate more value from data, his role is focused on system integration, which is a massive headache for him, for his team, and for the company. So on our visual map we can plot different blocks representing all the different systems that create data silos.
8:00 Then, if we speak to the data engineering lead, he tells us: "We're maintaining hundreds of pipelines copying data between lots of different data stores for different departments, and it's messy." To get around all these silos of data within and between departments and data products, the data engineering team have been flat out building data pipelines that copy data from here to there, and now they're in too deep: the pipelines are failing regularly, as there's just too much to manage and maintain within a small team. So on our diagram we can now see all of the different copies between each system and product, and how this is causing confusion, maintenance overhead, and problems with data quality as well. And it's not just the data engineering team this causes issues for. Imagine being a Power BI developer: how on earth would you know which datasets to connect to within the organization, and how can you know whether they're up to date and whether you can trust them?
8:56 The data science team faces similar issues. They mention: "We have data scattered in so many formats in different places, and it takes me days to get clean datasets just to begin an analysis, and even then I don't know if I can trust the data." On top of that, the sheer number of different systems creates a cognitive overhead for this team. They added: "I had to learn the intricacies of many different data technologies, and each one is different. We had a new starter, and it took three months to upskill them in all the different data platforms we use." So let's add those problems onto our visual. The data scientists mentioned that there's no uniformity in data formats across the organization, which makes it difficult to get everything into the common format needed to begin analysis. On top of that, they have to work every day across five, six, maybe seven data products, each with its own learning curve, user interface, and user experience, and this is inefficient, especially for new joiners. The different colored blocks here show how the user experience differs in each of the tools the company uses.
10:00 Next, the IT director is not happy either. He's saying: "We're using too many systems. They all have different security profiles and requirements for keeping data at rest and data in transit secure, and it's a nightmare." And it's true: it's not just the people on the front end of these tools who find it difficult. Spare a thought for the IT director and his team, responsible for managing access to each of these tools independently, securing them and the data stored within them, governing the data, and then monitoring and maintaining the systems generally. And as if that wasn't bad enough, he also gets the finance director on the phone every month to complain about the Azure bill. He added: "I dread getting our Azure bill every month. It's so unpredictable, and sometimes scary. Each data product has its own pricing structure, so it's difficult to predict how much we'll be charged month to month." So now we add billing and licensing to our visual, and we can see that each product has a different licensing and billing structure, which makes it really difficult for companies to manage.
10:58 And finally, there's the poor old dashboard user. They don't really know much about the palaver going on behind the scenes, but they can sense that something isn't right. They say: "I'm not sure I can trust the data being presented to me; it doesn't always reflect reality." And they're right to be nervous. All of this complexity means that the data teams have failed to really get a grip on data governance and data quality, and the dashboard users at the end of the process are losing trust. And when they lose trust in the visualizations and the data being presented to them, what's the point of all of this anyway?
11:32 So let's just pause there for a second. It was at this point that the head of Houston Electric, smart guy that he is, was watching the Learn Microsoft Fabric YouTube channel (wow, what a coincidence). He learned about Microsoft Fabric, and he knew it could help solve the problems they were experiencing as a business. And he was right. You see, within Microsoft Fabric we have a family of data products available to us, which loosely fall under the four buckets we were looking at before: data ingestion, data storage, data engineering and data science, and business intelligence.
12:08 Now, these tools, when looked at in isolation, serve a similar function to the tools the company was using previously, and a common myth you sometimes hear about Microsoft Fabric is that it's just a remarketing exercise, that Microsoft has simply put a new badge on existing technology. But this viewpoint fundamentally misunderstands that Fabric has been built from the ground up to address all of the problems we saw in the previous slides. It's not just a marketing exercise, but a complete rethink and re-architecture of how data is managed in your organization. Let's look at what we mean by that in more detail.
12:50 In Fabric, all of your company data resides in one place, called OneLake, a fundamental concept that we'll explore in more detail a little later. OneLake eradicates data silos: data across all these different products is actually stored in one place. And because of this, it also eradicates the need to create multiple copies of a dataset. A fundamental principle in Fabric is that a dataset should only ever exist in one place, and then be referenced throughout Fabric using a clever feature called shortcuts.
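The "one copy, referenced everywhere" principle behind shortcuts can be illustrated with a plain-Python analogy: a shortcut behaves like a reference to the single stored dataset, not a duplicate of it. This is a conceptual sketch, not Fabric's actual API, and the names are invented.

```python
# One physical dataset, multiple "shortcuts" that reference it.
# An illustrative analogy for OneLake shortcuts, not Fabric's API.
onelake = {"sales/orders": [{"order_id": 1, "amount": 250}]}

# Each workspace holds a reference (shortcut) to the same object,
# instead of a pipeline-maintained copy of the data.
data_science_ws = {"orders": onelake["sales/orders"]}
bi_ws = {"orders": onelake["sales/orders"]}

# An update to the source is immediately visible through every shortcut,
# so there is no copy to drift out of date.
onelake["sales/orders"].append({"order_id": 2, "amount": 99})
print(len(data_science_ws["orders"]), len(bi_ws["orders"]))  # → 2 2
```

Contrast this with the Houston Electric setup, where each copy made by a pipeline could silently fall behind the source.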
13:26 So all your organization's data is stored in OneLake, and in OneLake all data is stored in the same format, called Delta Parquet, which is an open-standards format. This solves so many of the problems we highlighted previously, most importantly the data integration problem: data scientists, data engineers, and data analysts might all be using different tools, and maybe different languages that they're familiar with, but under the hood they're all working on the same data, in the same format, and they're not wasting hours and days trying to get that data into a suitable format before they can begin their analysis.
14:05 Next, Fabric provides a unified user experience. We heard previously that people were sick of having to log into multiple different platforms, each with a different look and feel. In Fabric, you log in via a web portal, and the user experience is designed to feel similar to that of Microsoft 365: you log in once, you can navigate to any Fabric experience, and each experience has a similar look and feel. This means you can spend more time thinking about how you're going to get value from your data, and less time worrying about how to navigate the application.
14:36 And for people administering Fabric, there are so many benefits too. For starters, access control and security are drastically simplified: in Fabric there's one access control method and one security model applied across all tooling and experiences, and access to these resources is principally managed through workspaces, a workspace being simply a collection of Fabric items that makes sense for a particular security boundary you want to set up. And because all of our data lives in OneLake, data governance and discoverability become much easier tasks; in fact, Fabric has many features built in for governing your data. Next up is a really important one: Fabric comes with a single built-in Monitoring Hub, which monitors all Fabric activity across your experiences, so you have only one place to look if you want to monitor all of your different data pipelines, notebook runs, and any other processes you're running within Fabric.
15:32 Last but not least, billing and licensing in Fabric is unified, meaning that when you purchase some Fabric capacity for your organization, you immediately get access to all of the features and items; there's no longer a separate license or billing structure for each of the different products listed above. There are different levels of capacity you can buy, depending on the intensity of your usage, and something to bear in mind is that if your company is currently paying for Power BI Premium capacity, you get an F64 Fabric capacity (which is a lot of capacity) included for free with your Power BI Premium capacity. This means you can create and use any Fabric items today without paying any extra. Each of the things I've mentioned here is a topic in itself that I could go into in a lot more detail, so if you go through to the community link (I'll leave it in the description) and open the classroom tab, you'll find links to a lot more resources where you can learn about each of these specific Fabric features in more detail.
16:35 Okay, so now we're starting to understand the power of Fabric. You've heard about OneLake, the experiences, and the compute engines, and how powerful these things are when they work together; let's look at how it all works in more detail. In Fabric there are seven experiences, and they cover the whole end-to-end data analytics workflow that a company might face. An experience is just a logical grouping of tools that makes sense for specific personas; for example, in the data engineering experience you'll find easy access to the tools that a data engineer would use frequently.
17:09 In Fabric you'll find three main data stores, places where you can create and manage data: the data warehouse, the lakehouse, and the KQL database. The important point is that any tabular data you create in any of these stores is, under the hood, automatically stored in OneLake, and crucially, all the data stored in OneLake is stored in the Delta Parquet format, which, as I said previously, is an open-standards format. To interact with data in any of these stores, we can write T-SQL scripts in the data warehouse, we can write Python, R, or Scala in notebooks in the data engineering and data science experiences, or we can write KQL in the real-time analytics experience.
17:58 There are also a lot of low-code and no-code options for beginners in each of these experiences, and Microsoft Copilot is now tightly integrated into Fabric, so you don't need to worry if you don't know much coding. Fabric is really built to cater for everyone, from people with no coding experience all the way through to professional developers, and Microsoft is constantly adding more features like these to make Fabric a bit less intimidating, so that beginner and entry-level people can get up and running quickly.
18:30 Now, the ability to run scripts in many different languages against the underlying Delta Parquet format of OneLake is made possible by the four compute engines in Fabric. These compute engines act like the integrator: the user writes some code in a language they're familiar with, the relevant engine converts that into a query over the underlying Delta tables in OneLake, and it returns the data the user is expecting, in the experience they're currently using. Once the data is stored in OneLake, it's directly accessible by all the other engines without needing any import or export, and all the compute engines have been fully optimized to work with Delta Parquet as their native format.
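The idea that several engines query one shared copy of the data, with no import or export between them, can be sketched like this. The "engines" here are toy stand-ins, not the real T-SQL, Spark, or KQL engines, and the table contents are invented.

```python
# Toy model: several "compute engines" answer queries against the SAME
# shared tables, with no import/export step between them. These classes
# are stand-ins, not the real T-SQL, Spark, or KQL engines.
shared_delta_tables = {
    "reviews": [
        {"product": "cable", "stars": 5},
        {"product": "plug", "stars": 2},
    ]
}

class Engine:
    def __init__(self, name, store):
        self.name = name
        self.store = store  # every engine points at the same store

    def count(self, table):
        return len(self.store[table])

sql_engine = Engine("tsql", shared_delta_tables)
spark_engine = Engine("spark", shared_delta_tables)

# Both engines see the same row count, because there is only one copy.
print(sql_engine.count("reviews"), spark_engine.count("reviews"))  # → 2 2
```

The point is that each engine is a different front door onto the same stored tables, which is what removes the copy-and-sync pipelines from the earlier case study.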
19:14 As well as data stores and scripting tools, there are many more items available to us in each Fabric experience. So, now that we have a good understanding of how the overall architecture works, let's review each experience one by one, starting with Data Factory.
19:32 The Data Factory experience in Fabric is primarily focused on moving and transforming your data, and a core use case is using Data Factory to get new data into Fabric, perhaps from an external API or by connecting to one of your organizational systems. You could describe the Data Factory experience as a set of tools to help you with extract, transform, and load (ETL) functions. It's built for enterprise scale too, so if you have a lot of data or a high frequency of refresh, that shouldn't be a problem. The Fabric items you can create within this experience are the dataflow and the data pipeline. The dataflow basically allows citizen developers to connect to more than 300 data sources to bring data into Fabric and transform it, using the familiar low-code/no-code Power Query interface; so if you're a Power BI developer, or you've used Excel, this will be very familiar to you. The data can then be written into one of the Fabric data storage solutions, like a lakehouse or a data warehouse.
20:30 We also have the data pipeline. This is an orchestration tool, used to trigger different data processing workflows, normally on a schedule: for example, triggering the run of a Fabric notebook on a particular schedule, or maybe triggering a stored procedure in your data warehouse. Pipelines can also be used to bring data into Fabric. This set of tools provides similar functionality to existing tools you might be familiar with, like Azure Data Factory, Synapse pipelines, and Power BI dataflows (Gen1), and the main personas who might be using the Data Factory tools are data engineers, analytics engineers, and also Power BI developers.
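The orchestration pattern a data pipeline implements (run a sequence of steps in order, and stop if one fails) can be sketched in a few lines of Python. The step names are invented for illustration; a real pipeline would be triggering notebooks, stored procedures, and copy activities on a schedule.

```python
# Minimal orchestration sketch: run steps in order, stop on the first
# failure so downstream steps never run on bad upstream data.
# Step names are invented; this is the pattern, not a Fabric pipeline.
def ingest():    return "ingested"
def transform(): return "transformed"
def load():      return "loaded"

def run_pipeline(steps):
    results = []
    for step in steps:
        try:
            results.append((step.__name__, step()))
        except Exception as exc:
            results.append((step.__name__, f"FAILED: {exc}"))
            break  # halt the rest of the pipeline
    return results

print(run_pipeline([ingest, transform, load]))
```

A real orchestrator adds scheduling, retries, and alerting on top of this loop, but the run-in-order, fail-fast core is the same.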
21:14 Now, the data warehouse experience consists of, unsurprisingly, the data warehouse, which provides a familiar transactional data warehouse solution with tables, schemas, views, stored procedures, and all that good stuff. It's queryable using T-SQL, and it provides features to make it more accessible for citizen analysts: there are visual scripting and low-code/no-code solutions built into this experience as well. And it's lake-centric, meaning that under the hood it's a highly scalable architecture; it's not a traditional SQL Server, because although you can use T-SQL to query it, under the hood it's actually built on top of a completely different engine, called the Polaris engine.
21:54 So the Fabric item you can create within this experience is, as we mentioned, a transactional data warehouse built on top of the Polaris engine for scalability. This is where users can create tables, schemas, views, stored procedures, functions, all that good stuff. It provides a similar (although not identical) set of functionality to some existing tools: it's similar in a way to SQL Server or an Azure SQL Database, because it allows the user to interact with it using T-SQL, but it's more similar to Synapse SQL serverless or dedicated pools, because it uses the same underlying engine, the Polaris engine. And if you're using tools outside of the Microsoft ecosystem, it's similar to Snowflake; obviously there are lots of other tools I could mention here, these are just a selection. The main personas who might be using the data warehouse experience are database administrators, data engineers, and data analysts, those kinds of people.
22:54 Next up, we have the data engineering experience. Data engineering in Microsoft Fabric enables users to design, build, and maintain the infrastructure and systems that enable their organization to collect, store, process, and analyze large volumes of data. The Fabric items you can create in this experience are, first, the lakehouse, which allows you to store and manage both unstructured data (files) and structured data (lakehouse tables), converting files into tables using notebooks and other tools. The notebook is where a user can write and run scripts in a variety of languages to perform data engineering tasks, like cleaning or validating your data, or whatever you like really. The languages available to you are Python, R, and Scala, and the notebook is built on top of Apache Spark, a big-data processing framework commonly used in the data industry.
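As a flavor of the kind of cleaning task a notebook is used for, here's a plain-Python sketch with invented field names. In Fabric you'd typically do this with PySpark against lakehouse tables; this just shows the shape of the work.

```python
# Typical notebook-style cleaning: trim whitespace, normalise case,
# drop rows missing required fields, and de-duplicate. The field names
# and rows are invented for illustration.
raw_rows = [
    {"email": "  A@example.com ", "country": "us"},
    {"email": "a@example.com",    "country": "US"},   # duplicate once cleaned
    {"email": None,               "country": "GB"},   # missing required field
]

def clean(rows):
    seen, out = set(), []
    for row in rows:
        if not row.get("email"):
            continue  # drop rows with no email
        email = row["email"].strip().lower()
        if email in seen:
            continue  # drop duplicates
        seen.add(email)
        out.append({"email": email, "country": row["country"].upper()})
    return out

print(clean(raw_rows))  # → [{'email': 'a@example.com', 'country': 'US'}]
```

The same trim / normalise / drop / de-duplicate steps translate directly into PySpark DataFrame operations when the data is too big for a single machine.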
23:54 We also have the Spark job definition. This is a set of instructions, typically written in Python (PySpark), that defines how to execute a job on the Spark cluster, and it's more for advanced users who want a bit more control over how their data is going to be processed by the Spark engine. This set of tools provides similar-ish functionality to some tools you might be familiar with, like ADLS (Azure Data Lake Storage), Databricks, or Snowflake, and the main personas who might be using the data engineering experience are data engineers and analytics engineers.
24:35 Next up, we have the data science experience. The data science experience provides a complete set of tools to support the entire data science workflow within an organization, right the way through from data exploration, preparation, cleansing, and experimentation, to modeling and model scoring, and the serving of your predictive insights within a Power BI report. The Fabric items you can create in this experience are, first, the notebook. The notebook is a key tool for data scientists, who use it to explore data through code, typically in Python or R; notebooks are used for data exploration, running experiments, training machine learning models, and the other things that data scientists do as well.
25:21 Another item in this experience is the experiment. When a data scientist is training a machine learning model, they will typically run a lot of experiments to optimize the model, and the experiment item in Fabric provides functionality to track each of these iterations, logging things like the parameters being used, the code version, and the evaluation metrics for that particular model run. Experiments use MLflow, which is basically the industry standard for this kind of logging of machine learning model training and experimentation. Finally, we have machine learning models: during the experimentation phase, specific versions of machine learning models can be registered using MLflow, and Fabric provides functionality for managing and reviewing these models using the machine learning model item.
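The experiment-tracking idea (one logged record of parameters and metrics per training run, which is what MLflow standardizes) can be sketched with a toy tracker. This mimics the sort of record MLflow's log_param and log_metric calls produce; it is not the MLflow API itself, and the hyperparameter values are invented.

```python
# Toy experiment tracker: each training run logs its parameters and
# evaluation metrics so runs can be compared later. This mimics the
# kind of record MLflow keeps; it is not the MLflow API.
runs = []

def log_run(params, metrics):
    runs.append({"run_id": len(runs) + 1, "params": params, "metrics": metrics})

# Two hypothetical training iterations with different hyperparameters.
log_run({"learning_rate": 0.1,  "max_depth": 3}, {"accuracy": 0.81})
log_run({"learning_rate": 0.01, "max_depth": 5}, {"accuracy": 0.86})

# Pick the best run by its evaluation metric.
best = max(runs, key=lambda r: r["metrics"]["accuracy"])
print(best["run_id"], best["params"])  # → 2 {'learning_rate': 0.01, 'max_depth': 5}
```

Registering the winning run's model is then the step the machine learning model item manages.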
26:14 You'll find similar functionality here to what you might find in other tools like Azure Machine Learning, Synapse notebooks, and Databricks notebooks, and the main persona who's going to be using the data science experience is, not surprisingly, the data scientist.
26:30 The real-time analytics experience in Fabric provides a set of tools to ingest, manage, and analyze real-time event data. Event data is really a completely different paradigm in data analytics, and it requires a different mindset and a different set of tooling. The Fabric items you can create within this experience are the KQL database, which is a data store for your streaming datasets, built on top of the KQL (Kusto Query Language) engine; if you're familiar with Azure Data Explorer, for things like logging, this is the same engine being used in the KQL database within Fabric. We also have event streams: a no-code tool to register streaming datasets, process them, and then route them to various destinations in Fabric.
27:23 kql query set now this basically allows
27:27 you to query data in a kql database
27:29 using the custa query language and this
27:30 set of tools provides a similar
27:33 functionality to the existing tools as I
27:35 mentioned it's very similar to Azure
27:38 data Explorer now there are some new
27:39 things like the event stream that's
27:41 completely new but it's built on top of
27:44 the same engine and the personas well it
27:46 could be a DAT engineer or an analytics
27:49 engineer or if you have real time or iot
27:52 Internet of thing Engineers or perhaps
27:54 security Engineers because this is used
27:57 quite a lot with security event logging
27:59 and tracking that sort of thing as well
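The ingest, process, and route flow just described for event streams can be sketched in plain Python. To be clear, this is only an illustration of the pattern, not Fabric's API: Eventstream itself is a no-code tool, and every name below (`route_events`, `"kql_database"`, `"lakehouse"`) is hypothetical.

```python
# Illustrative sketch of the ingest -> process -> route pattern that an
# event stream implements. Not Fabric's API: Eventstream is configured
# in a no-code UI, and all names here are hypothetical.

def route_events(events, routes, default="lakehouse"):
    """Send each event to the destination registered for its type."""
    routed = {}
    for event in events:
        # Route step: look up a destination by event type, with a
        # fallback destination for unregistered types
        dest = routes.get(event["type"], default)
        routed.setdefault(dest, []).append(event)
    return routed

# Register streaming sources and their destinations
events = [
    {"type": "sensor", "temp_c": 5.2},
    {"type": "clickstream", "page": "/home"},
]
routes = {"sensor": "kql_database", "clickstream": "lakehouse"}
routed = route_events(events, routes)
# routed["kql_database"] now holds the sensor event,
# routed["lakehouse"] the clickstream event
```

The point of the sketch is that routing is just a per-event lookup; in Fabric the same decision is expressed visually rather than in code.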
28:02 so powerbi is Microsoft's business
28:04 intelligence solution that allows you to
28:06 create reports to present visual
28:08 insights to business users within fabric
28:11 there's quite a lot of items that you
28:13 can create under this experience I'm
28:14 going to be talking about two of the
28:18 main ones first one being the report so
28:20 the report is basically a business
28:23 facing business intelligence report and
28:25 you use it to visualize business data
28:27 and insights and show them back to
28:30 your users and your business and when you
28:32 publish a report you also get an
28:35 Associated semantic model and these are
28:37 what used to be called a data set in
28:39 powerbi until not too long ago and the
28:41 semantic model is a collective term that
28:44 refers to all the things that make up
28:46 kind of like the back end of a powerbi
28:48 report so things like your tables your
28:51 relationships your Dax measures now
28:53 things like calculation groups as well
28:54 all these different things make up the
28:57 semantic model this experience provides
28:59 a set of tools that you know is similar
29:02 to tools you may have used like Tableau or Looker
29:04 obviously it's not exactly the same
29:06 there's lots of differences but it's
29:07 these kind of business intelligence
29:09 tools if you're coming from a different
29:12 platform and the main users and personas
29:14 who might be using this experience will
29:17 be well the business users you know not
29:19 necessarily data professionals but
29:23 people who are consumers of your reports
29:25 then you have powerbi developers and
29:27 maybe BI analysts and data analysts are
29:30 all going to be using powerbi now the
29:32 final experience I want to touch on here
29:36 is data activator and data activator is
29:38 currently in preview still and it's a
29:40 very new experience but it's a no code
29:43 experience in fabric for automatically
29:47 taking actions for example running a
29:50 power automate script when patterns or
29:53 conditions are detected in changing data
29:55 and this could be data in a powerbi
29:57 report or it could be event streams as
29:59 we've just seen in the real time
30:02 experience and in data activator you can
30:05 create what's called a reflex now a
30:08 reflex lets you define a specific data
30:10 point that you're tracking maybe it's
30:13 the temperature of a fridge in an iot
30:15 system for example and then it also
30:18 allows you to define a condition so
30:21 whenever the temperature of your fridge
30:26 gets greater than 4°C (about 39°F)
30:27 it's going to trigger
30:30 an action and your action can be a
30:31 variety of different kinds of actions it
30:33 could be a power automate script that
30:35 runs or it could be an email
30:38 notification or it could be an HTTP
30:42 request to some other Azure function for
30:45 example or all these different options
30:47 but basically it monitors your data in
30:51 real time and then when the data reaches
30:53 some condition it's going to perform an
30:55 action data activator I would say is
30:57 quite unique in the data analytics
30:59 Market I don't think there's too many
31:01 companies doing a similar thing at least
31:03 at this scale so you might have used
31:07 tools like power automate or Azure
31:09 functions which have a similar kind of If
31:11 This Then That Type Paradigm but
31:12 obviously in Azure functions you have to
31:15 create that logic yourself most of the
31:17 time and power automate is not really
31:19 built for Enterprise scale data
31:21 processing but these are some of the
31:23 tools that you could describe it as
31:25 similar to although it is unique in its
31:26 own way and the people who are going to be
31:28 using this
31:30 tool are again business users because
31:33 this is an entirely no code experience
31:35 so it's built for people in the business
31:38 to set up these things without you
31:40 having to do it so if you've got someone
31:43 in marketing who really wants to track
31:45 you know social media Impressions and
31:47 when they get above a certain level they
31:49 want to be notified you know if one of
31:52 your posts is going viral you don't want
31:55 to be sitting reading the powerbi report
31:57 day after day hour after hour really
31:59 you want to set a threshold and once it
32:02 goes above a thousand impressions then you
32:04 want to be alerted to that fact and then
32:07 you can do something about it so that's
32:09 why data activator is so powerful
32:10 because people in the business can set
32:13 these up for themselves in a no code way
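The reflex pattern described above — track a data point, test a condition, fire an action — can be sketched in a few lines of Python, using the fridge temperature example. This is a hedged illustration of the if-this-then-that logic only; Data Activator is no-code, and none of these names come from its API.

```python
# Hedged sketch of the "reflex" pattern from Data Activator: track a
# data point, test a condition on each new value, and fire an action
# when the condition holds. All names here are hypothetical, not the
# product's API.

def make_reflex(get_value, condition, action):
    """Build a checker that fires `action` when `condition` holds."""
    def check(event):
        value = get_value(event)
        if condition(value):
            return action(value)   # e.g. send an email or HTTP request
        return None                # condition not met, stay quiet
    return check

# Fridge example: alert once the temperature exceeds 4 degrees C
fridge_reflex = make_reflex(
    get_value=lambda e: e["temp_c"],
    condition=lambda t: t > 4,
    action=lambda t: f"ALERT: fridge at {t} C",
)

fridge_reflex({"temp_c": 3.0})  # no alert
fridge_reflex({"temp_c": 5.5})  # returns "ALERT: fridge at 5.5 C"
```

Swap the getter, condition, and action and the same shape covers the social media example: get impressions, test `> 1000`, send a notification.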
32:15 so we've covered a lot of ground there's
32:17 probably information overload going on
32:20 inside your heads so let's just pause
32:22 here for today but before we finish I
32:24 just want to let you know how you can
32:27 get started using Fabric and then show
32:29 you lots of links to more resources that
32:31 I find really helpful for learning
32:32 Fabric and I think would help you with
32:35 your journey as well okay so here I am
32:38 inside of the Skool community learn
32:40 Microsoft Fabric and if you haven't been
32:42 here before I recommend you go here you
32:45 know we've got 15 members currently but
32:46 it's likely to go up a lot more I
32:48 haven't even advertised it yet so follow
32:50 the link in the description here I
32:51 basically answer any questions that
32:54 you've got and you can ask me any
32:55 questions or other people in the
32:57 community as well and generally we
33:00 have lots of discussions about fabric so
33:02 if you want to learn fabric I recommend
33:05 you go here and in here I've got this
33:07 classroom content and it's called
33:09 introduction to Microsoft Fabric and
33:11 within this course you'll find all of
33:13 the notes about what I've just been
33:15 describing and at the
33:17 bottom there's this getting started link
33:19 and here I describe two methods for how
33:21 you can get started with Microsoft
33:23 fabric the first one is to follow
33:25 through to the documentation that's the
33:27 Microsoft documentation to set up a free
33:30 trial now this is good if you have admin
33:33 privileges within your existing company
33:35 or within your Microsoft tenant now if
33:38 you're just someone at home who wants to
33:41 set up their own fabric instance
33:43 especially with its own admin privileges
33:46 then there is a way to do this via the
33:49 M365 developer account I've linked to a
33:52 blog post here on datawitches.com
33:54 and this is a really good blog post
33:57 because it shows you how to set up an
34:00 M365 developer account and it is a little
34:02 bit long-winded but basically allows
34:04 you to get a fabric free trial with
34:07 admin privileges so really you can use
34:09 it to do whatever you want for 60 days
34:10 and in this blog post it tells you
34:12 exactly how to do that so I recommend
34:14 following these instructions and then
34:17 getting your sandbox environment set up
34:20 for fabric on top of that I've also got
34:24 these links to more resources and some of
34:27 these links are provided by Microsoft so
34:28 in their introduction to fabric
34:31 presentation they list these resources
34:33 so these are all Microsoft resources at
34:35 the top so links to the documentation
34:37 links to the ebook links to Microsoft
34:40 learn fabric modules some end-to-end
34:42 scenarios and fabric notes if you're
34:44 more of a visual learner these are some
34:46 revision cards for Microsoft fabric that
34:48 you might find useful now on top of that
34:50 I've listed these personal
34:52 recommendations of what I think are good
34:54 resources for you to learn from so one
34:56 of the best ones that I use quite
34:57 regularly
34:59 is the Azure Synapse YouTube channel so
35:02 they've kind of rebranded let's say and
35:03 they're doing quite a lot of content on
35:05 Microsoft fabric particularly there's a
35:08 series called fabric espresso and what's
35:10 good about this is that they interview
35:12 the people working on the fabric
35:14 product teams about how their particular
35:16 product works or how a new feature works
35:18 so I definitely recommend you going
35:20 there next up is advancing analytics
35:22 YouTube channel and this is really good
35:23 particularly if you're interested in
35:26 learning more about the lakehouse and
35:27 lakehouse principles
35:29 Simon there has got a lot of experience
35:31 in building and designing lake houses
35:34 and The Spark engine in particular as
35:35 well so if you're looking to learn that
35:37 kind of stuff I would recommend
35:41 advancing analytics we've also got KratosBI
35:43 run by Chris Wagner which is an awesome
35:46 place to learn more about Fabric and
35:49 every Friday they have a
35:51 live stream well for me it's in
35:53 the mornings I think if you're in the US
35:54 it would be around lunchtime and yeah
35:58 it's just really good resource to learn
36:00 about fabric next up we have Tales from
36:02 the field and tales from the field again
36:05 is like a regular discussion and Roundup
36:07 about what's happening in the fabric
36:09 data community so if you don't have much
36:11 time to get your head around all the
36:13 different news and events and ways that
36:15 people are working with fabric then I
36:17 recommend Tales from the field it's a
36:19 great bunch of guys all working for
36:21 Microsoft all working with clients on
36:22 kind of figuring this stuff out as well
36:24 so there's some good insights in that
36:27 one as well I'd also recommend
36:31 fabric.guru which is a blog run by Sandeep Pawar and
36:33 this is a really good resource for
36:36 helping you understand a wide range of
36:38 topics especially Sandeep loves going
36:40 into some of the more technical
36:42 stuff figuring out how these systems can
36:45 integrate working out cool new workflows
36:47 that are available to us in Fabric and
36:50 he documents them really well in a lot
36:51 of detail so I definitely recommend you
36:54 checking that out another one is data
36:56 Mozart focusing mainly on powerbi
36:59 but now is producing a lot more content
37:02 on fabric particularly recently around
37:05 the DP-600 which is one of the certification
37:07 exams it's the only fabric certification
37:09 exam currently that allows you to become
37:13 a certified fabric analytics engineer so
37:14 if you're interested in that I
37:15 definitely recommend checking out Data
37:19 Mozart and the final one is a Blog by Sam
37:21 Debruyne and this is basically a
37:23 collection of articles about fabric
37:25 I think previously he was
37:27 focusing on Azure and kind of the
37:29 Microsoft data stack in general but now
37:31 Sam's been producing some of the best
37:33 fabric blog posts that I've read
37:36 so I definitely recommend if you
37:38 like reading blog posts about fabric to
37:39 check this one out as well if you've
37:41 made it this far I want to say thank you
37:43 very much for watching and I would love
37:45 it if you leave a fire Emoji in the
37:47 comments just to let me know that you've
37:49 made it all the way to the end if you
37:52 have any questions then I recommend that
37:54 you join our learn Microsoft fabric
37:57 community and ask away the link is in