Okay, so here I am in ChatGPT, and if I ask it a question like "What's the best course on SaaS on WordPress?", you will see WordPress SaaS 2.0 from my website, Learn With Hassan, pop up as the first result. You can see this is my website, and the source is ChatGPT. So I'm getting free traffic directly from ChatGPT for a prompt like this one. Let's copy it, open Claude, hit enter, and again you will see I'm the first result on Claude. Go to Perplexity with the same prompt, and you will see my course on top of Perplexity. So my website and my course are ranking on top of AI models. In this video I will share with you seven techniques to rank your website on top of AI models like ChatGPT and Claude, and more importantly, I will share with you a free system I built so you can track your performance and your website mentions inside AI models. If you are ready, let's get started. Okay,
before we start, let's take about 30 seconds to recap how organic traffic worked before AI, in 2023 and earlier. We had Google, and people would search for keywords, for example "WordPress SaaS course." So this is a keyword people might search for, and you would rank your website on top of Google to get organic, free traffic directly from Google. So we used to do keyword research with keyword research tools like this one: I search for keywords, I see the search volume, and then I go to my website and optimize the page, what we call on-page SEO and off-page SEO, and start optimizing to rank on Google. More importantly, we used to track performance with Google Search Console, so you can see how many clicks and impressions you are getting, and so on. So this is traditional SEO. Is it dead? In my opinion, of course not. I still get a lot of traffic from SEO today, and I believe it's still very important. But beside SEO, today we have something new: AIO, or AI optimization, to get traffic directly from AI models like ChatGPT and others. As I mentioned, in SEO we use Google Search Console to track our website's performance on search engines and see which keywords are getting impressions and clicks. For AI, we need something similar: we need to track our website mentions on ChatGPT and other AI models. For example, as a website owner I want to know that this prompt here generates a response that mentions my website, and so on, so I can track my website's performance on AI models. So I did some research to find tools that can track performance on AI models, and I found things like LLMrefs starting at $79 a month. Another tool is called keyword.com. Another is SE Ranking, starting at $95 a month. We have Ahrefs starting at $129. We have Peec, or however it's pronounced, at $89 or $199 a month. We have First Answer starting at $39 a month for only 10 monitored prompts, and so on. So we have several tools, and
honestly, I didn't test any of these tools, and they might be great. But in terms of pricing, they felt somewhat expensive, especially for people starting out on a tight budget. So I decided to build an alternative. First, I will show you how to set up this system so you can track your performance on AI models. Then I will share with you the seven tactics to rank your website inside AI models. This way, we have something like a full course on AI optimization for free organic traffic from AI. To build this tracking system, I used Make. I chose Make because it's no-code, it's super simple, and it's totally free to get started. If you are new here, Make is a no-code tool that lets you build any automation or system you want in a super simple way, and I partnered with Make to give you one month of Pro access totally for free. Just sign up using the link in the description and follow along with me.
In my case, let's log into my account. When you are in the dashboard, just click on Scenarios, then Create a new scenario. Our tracking system consists of three scenarios: one for tracking, which you'll see right now, and two more to build the user interface, this dashboard here. You can see when I click this button, we get a simple report to track everything with a clean UI. So let's go back to our scenario. To make things simple for you, I will share in the description a link to the JSON file you can use to import the scenario directly. Just click on these three dots here, choose Import blueprint, choose the file, select the blueprint, save, and boom, you get the scenario. Now, forget about this part here; I'll talk about it later. Our main flow is this one here.
Okay, it's super simple, as you will see right now. The first module, the first step, just reads the queries, the prompts we want to track on AI. In Make, we have something called a data store, which is built in, and this is what makes Make great: it has all the features you want directly inside, without any external services. You can see I created these four databases, or data stores, or tables, which connect with our scenario. The idea is super simple, as you will see right now. First we have LLM queries. If you browse this one, you will see I added four queries, the four prompts I want to track. Again, if you go to ChatGPT, this is our prompt. It's like keywords in the old SEO; today we have prompts, or what we call LLM queries. So here you add your queries. The second data store is for keywords: the terms you want to track. In my case, I want to track my website mentions, so I added "learn with hassan" and "learnwithhassan.com."
So these are the two main keywords I want to track. And we have two other tables: one for keyword tracking, where you can see I'm tracking when my website is mentioned for each keyword, and another table for brand mentions. So this system not only tracks your website, it also tracks your competitors, and this is a real game changer: you can see where your competitors are ranking on AI, so you can optimize and outrank them. So we have four data stores, and it's super simple to create such a data store. If you click on edit, you will see I simply have a query field here; that's all. Just click Add data store, give it a name, for example "llm queries," click Add, add an item, and simply add "query." That's it. Click Save, and you get this data store, and so on.
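To make the table layout concrete, here is a minimal Python sketch of the four data stores as plain in-memory records. The field names follow the video; the sample values and the `track_keyword` helper are illustrative assumptions, not Make's actual data-store API.

```python
from datetime import date

# The four data stores modeled as plain Python records.
llm_queries = [  # the prompts we want to track
    {"query": "What's the best course on SaaS on WordPress?"},
]
keywords = [  # the brand terms we look for in responses
    {"keyword": "learn with hassan"},
    {"keyword": "learnwithhassan.com"},
]
keyword_tracking = []  # rows: query, keyword, found, date, llm
brand_mentions = []    # rows: brand, query, date, url

def track_keyword(query, keyword, found, llm="gpt-4o-search-preview"):
    """Append one row shaped like the keyword-tracking data store."""
    keyword_tracking.append({
        "query": query,
        "keyword": keyword,
        "found": found,
        "date": date.today().isoformat(),
        "llm": llm,
    })

track_keyword(llm_queries[0]["query"], "learnwithhassan.com", True)
```

In Make itself these are four separate data stores; the sketch just shows the row shapes the scenario reads and writes.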
For keyword tracking, if you open this, we have the query again, the keyword, "found" (a boolean value, true or false), the date, and the LLM, in case you are targeting multiple AI models like ChatGPT, Claude, and so on, so we can track which AI we are monitoring. Then save. The same for keywords: it's a very simple structure with just a keyword field. And for brand mentions, you will see we have brand, query, date, and URL. That's it, four fields. It's super simple; you just need to copy these structures to build your four data stores. So when you build
them, you will see the first module is simply connected to LLM queries, so we are reading the prompts we want to track. The second step is an iterator, which loops over them and reads them one by one. Then we call ChatGPT. Please focus here: what I'm doing is calling ChatGPT with the GPT-4o search preview model. One important point: if you go to ChatGPT again, open a new chat, and paste the prompt, you will see that ChatGPT searches the web before generating the answer. So it is using a tool to search, and then it responds. The same here: if you want to mimic this behavior, you need to use the search preview model. You can also use the default model without web search to track your mentions on LLMs without search. So we have a model with search and a model without search; in my case, I'm using the search preview to mimic ChatGPT's behavior. Then I created these two OpenAI modules to extract the data in a formatted JSON structure. You see here I tell it to return a JSON with brands and URLs, so it's simple and easy for me to read and save in a database. I will
show you an example now to understand what's happening. This module here just takes the JSON, formats it well, and converts it to structured data. These modules are just for structuring the response we get from here. Then, when we have all the brand mentions, we iterate again, looping one by one, and save them to our second data store, brand mentions. So here we are extracting all the brand mentions and saving them inside the data store.
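The extraction step can be sketched in Python. The `raw_response` payload and the `{brand, url}` shape below are assumptions based on the prompt described above ("return a JSON with brands and URLs"); the real module output may be shaped differently.

```python
import json

# Hypothetical example of the structured output the OpenAI modules produce:
# the model is asked to return a JSON object listing brand mentions.
raw_response = '''
{"mentions": [
  {"brand": "Learn With Hassan", "url": "https://learnwithhassan.com"},
  {"brand": "Some Competitor", "url": "https://example.com"}
]}
'''

def extract_mentions(raw):
    """Parse the model's JSON text into a list of brand-mention records."""
    data = json.loads(raw)
    return [{"brand": m["brand"], "url": m["url"]} for m in data["mentions"]]

mentions = extract_mentions(raw_response)
```

Each record in `mentions` is then what the iterator loops over and saves to the brand-mentions data store.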
Then I want to check if my keywords, if my website, are mentioned in the responses. You can see I get my keywords from the keywords table and loop over them, and here we have a filter: I check if the message coming from ChatGPT contains my value, my keyword, and I save it inside the keyword-tracking data store. So, to sum up: the first step gets all the prompts. The first part here extracts every brand and company mentioned in the responses and saves them. Then the second part, these three modules, checks only for my keywords, filters the responses containing my website, and saves them in this data store.
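The filter logic amounts to a substring check. Here is a minimal sketch, assuming a case-insensitive match (the exact matching rule in the Make filter may differ):

```python
def keyword_found(response_text, keyword):
    """Case-insensitive substring check, like the filter on the Make route."""
    return keyword.lower() in response_text.lower()

# Hypothetical model response used for illustration.
response = "The top pick is WordPress SaaS 2.0 from learnwithhassan.com."
my_keywords = ["learn with hassan", "learnwithhassan.com"]
hits = [kw for kw in my_keywords if keyword_found(response, kw)]
```

Note that a plain substring match misses variants ("learn with hassan" does not match "learnwithhassan.com"), which is one reason to track a few spellings of your brand.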
Let me run this system once inside Make to help you understand what's going on. The first step gets the queries, then it starts looping one by one. This is the first iteration: we are calling ChatGPT with search and extracting the responses. You can see here we have the response message content; this is the ChatGPT response, and you can see my website is in it. The second step extracts this response: you see "choices," and we are just transforming the response into this structured JSON. Then this module turns it into a structured response that Make understands: you see here we have the first response, the second response, the third response, and so on. So we now have the full results, all the brand mentions, structured in JSON format. Then we have the iterator, and it saves all the records to the data store. And here we are filtering my mentions: you can see the filter checks if I am mentioned, and if so, it stores the mentions in the data store. If you go back and open, for example, brand mentions, you will see all the records we got. We now have around 180 records of brand mentions saved inside this data store. So this is the first scenario you want to import.
One note: you need to configure all these data stores, this one with LLM brand mentions, this one with LLM keywords, and this one with LLM keyword tracking. And finally, don't forget to set up your OpenAI key here. To add a connection, you just click add and enter your API key, so you have access to OpenAI. So the first scenario, the one that tracks and collects data, is ready. You just need to run it. You can schedule it, for example, once a day or once a week; it's up to you. Now we have the data saved; we just need a way to read and view it in this simple dashboard, so we can track it with a simple UI. So the second step is
to create these two other scenarios, which are very basic. This one reads data from our keyword-tracking store and returns it as a JSON array. Again, I will share these scenarios so you can just import them; they're very simple. We have a webhook, which acts like a URL. Let me show you: if I open it, it gives you a URL, and if I copy it, open my browser, paste, and hit enter, you will see the full data in JSON format right in the browser. So this is simply a system that lets me read data from the database inside Make.
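What the dashboard does with that URL can be sketched like this. `sample_payload` is a stand-in for what the webhook would actually return; the field names mirror the keyword-tracking table described earlier, and the real payload may differ.

```python
import json
# from urllib.request import urlopen  # for fetching the real webhook URL

# Stand-in for urlopen(WEBHOOK_URL).read(): the keyword-tracking
# table returned as a JSON array.
sample_payload = '''[
  {"query": "best course on SaaS on WordPress",
   "keyword": "learnwithhassan.com",
   "found": true,
   "date": "2025-01-15",
   "llm": "gpt-4o-search-preview"}
]'''

records = json.loads(sample_payload)
found_count = sum(1 for r in records if r["found"])
```

The UI page just fetches this array from the webhook URL and renders each record as a row.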
This is the first one. The second system reads the brand mentions the same way: we have a webhook, or call it an API endpoint, a URL you get and use to read the full data in JSON format. Of course, you are not going to read such data in the browser, which is why I created this simple interface for you. You just enter your Make URL here, click Fetch data, and it formats everything: the query, your mention, the keyword, which language model, the date, the second keyword, and so on. The same goes for brand tracking: just fetch the data, and it will show you all your competitors, all the web pages ranking for the same keyword on ChatGPT. You can see we have different websites; I can discover my competitors and see who is building something like my course. And to make things simple for you, you can just use this simple page. If you want the source code, I will leave it so you can run it locally, or maybe optimize it, host it, and have your own system to track your mentions on language models. So again, we have three systems, three scenarios: one for tracking, one for reading brand-mention data, and one for reading keyword data, plus this UI. So you have the full system to track your performance on AI models. Now, before I
move on to the seven tactics to rank your website, now that we are monitoring performance, I want to mention that you can extend the system to track any AI you want. You just need to go here and add a router. You see this route here: instead of ChatGPT you can use, for example, Anthropic's Claude, and clone the same path. You can add a route, search here for Perplexity, and add Perplexity AI, or simply Gemini, and so on. So you can track on multiple AI systems. Just make sure, when you save the data, to set the LLM field; in my case it's GPT-4o, and you would change it to Gemini or Claude or whatever. This way, in the UI, you can see the different LLMs here in this section.
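The router idea can be sketched as a loop over models. `call_model` is a stub standing in for the per-model API modules in Make, and the model names below are illustrative placeholders:

```python
def call_model(model, prompt):
    """Stub for the per-model API module on each router route."""
    return f"[{model}] response to: {prompt}"

# Each route sends the same prompt to a different model and tags
# the saved record with the model name (the 'llm' field).
models = ["gpt-4o-search-preview", "perplexity", "gemini"]
prompt = "What's the best course on SaaS on WordPress?"
results = [{"llm": m, "response": call_model(m, prompt)} for m in models]
```

Tagging each record with the `llm` field is what lets the dashboard break results down per model later.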
And what's nice about Make is that they introduced this new grid system, where you can see all your scenarios in this awesome grid view. So you can track exactly what's going on inside your systems, and this is especially helpful if you want to scale the system and even build a backend for your micro-SaaS with Make. Yes, with a no-code tool like Make, you can build a full backend for your micro-SaaS tools. You can track everything, switch to the 2D view, turn on light mode, get notifications if anything happens, and spot any problems easily inside your account and your systems. Not only that: on my website, I published a case study, a stress test I ran with Make, to check whether it can handle 1,000 requests per minute, simulating real-world micro-SaaS traffic. You can read this blog post, and in the results you can see I got a 100% success rate in all tests. So we can comfortably say Make can handle 1,000 requests per minute, and you can build a full backend for your tools or micro-SaaS with no code and Make, and the grid system will definitely help when you scale your scenarios with Make. One last
note: the system I built here is somewhat simplified to help you understand the concept. You can add more fields to track, for example history, so you can build a chart. You can add more fields, track more data, expand and optimize as you like, and maybe turn this into your own tool, your own micro-SaaS for LLM tracking. Okay, now it's time for the seven tactics to rank your website and get it inside AI responses, putting it in front of the millions of users using AI models every single day. But a small disclaimer, to be transparent: what I'm about to share is not 100% guaranteed, simply because this field is still new and there are limited tests and research around the topic. I myself am still testing and experimenting with different tactics to optimize my websites for AI, and you will see some examples from websites I'm still building and experimenting with right now. But I tried my best and did the research, experiments, and tests to share with you the most important tactics and tips to rank your website on AI and get free traffic from AI. So
let's get started with tactic number one: adding statistics, numbers, and proof to your content. LLMs, or AI models, are trained to give priority to content with verifiable information and factual claims when generating a response to a prompt. So whenever you write a blog post, a page, anything, try your best to mention numbers and proof. For example, and maybe it's not the best example, but it's something we can rely on: the stress test above. Here I'm sharing a test I actually ran, verified with numbers. By the way, while I was running this test, I asked ChatGPT something related, and you can see my website, even my Medium blog, appeared in ChatGPT's results. So it used my content to respond to my prompt. I think this is because I have a real test, something verifiable, in my blog that LLMs can rely on when responding. So the idea is to create content such that when an LLM reads it, it sees verifiable information and builds what we might call a trust or confidence score. This helps it consider your website a source of information when generating responses.
Tactic number two: engaging on Reddit, forums, and Quora. LLMs, or AI models, are trained on millions of Reddit conversations and threads, Quora questions and answers, and other user-generated content, because it's often seen as authentic and unbiased. So language models and AI companies treat user content and discussions as a source of authenticity. This is why you should consider joining Reddit, forums, Quora, and communities, so you can send trust signals to language models through these sources. You can also build your own community, your own forum, on your website, so users engage, ask questions, and answer on your site; this may send another trust signal to AI that your website hosts authentic conversations, so it will treat it as a trusted source. For example, on my website you can see I have a community with forums where people engage, ask questions, and discuss topics related to marketing and digital products. Tactic number three is FAQ
optimization, or frequently asked questions optimization. Before AI, people usually searched Google for keywords, for example "WordPress SaaS," just a keyword. Today, people are more into questions, asking full questions like "What is the best online course to build SaaS on WordPress?" So people will ask full questions, and this is where answering those questions on your website comes in. For example, in this blog post, if you scroll down, you will see I have an FAQ section answering common questions people might ask about the topic. This way, when ChatGPT or another AI finds the exact same question on my website, with the answer, combined with the trust signals we mentioned before, it may use my response, my website, to answer the user's question. So you can search Google and use tools to find the questions people are asking, then answer them inside your blog posts and web pages. Even if you are building tools, like in my case with my toolbox, make sure to add an FAQ section like this one to answer common questions about such tools.
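One way to make an FAQ section even more machine-readable is to also express it as schema.org `FAQPage` structured data. This goes a bit beyond what the video prescribes, so treat it as an optional sketch; the question and answer text are placeholders:

```python
import json

def faq_jsonld(pairs):
    """Build schema.org FAQPage structured data from (question, answer) pairs."""
    return {
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": q,
                "acceptedAnswer": {"@type": "Answer", "text": a},
            }
            for q, a in pairs
        ],
    }

snippet = json.dumps(
    faq_jsonld([
        ("What is the best keyword research tool?",
         "It depends on your budget; see the comparison on this page."),
    ]),
    indent=2,
)
```

The resulting JSON would go inside a `<script type="application/ld+json">` tag on the FAQ page.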
People may ask questions like "What is the best keyword research tool?", and if I answer that question here, ChatGPT may use my answer and reference my web page, which includes my tool, and so on. Tactic number four is creating comparison tables, or structured data. These tables and comparison content help LLMs and AI models answer questions like "What is the best X?", "What is the best product?", and so on. For example, just yesterday I was working on this new comparison page, specifically to test this AI traffic strategy. You can see I'm building this VPS companies page on my website, selfhostchool.com, adding different VPS providers with comparisons, collecting real, trusted data from the web, and building this structured table on my website, hoping that whenever someone asks for a VPS comparison, or for the best VPS, AI can use my collected, structured data to answer people's questions, and I can get traffic to my website from AI using this strategy. So, in your niche, find what kind of comparison data you can create, and build these tables or this content directly on your website. It's fairly simple today with WordPress and some custom snippets, if you are following my other videos on building SaaS or micro-SaaS with snippets on WordPress, and I will have a full video soon about creating such library and directory websites on WordPress, a full course here on my channel. So don't forget to turn on notifications to get every new update.
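A comparison table like that can be generated from structured records. Here is a minimal sketch, with made-up provider names and numbers standing in for the real data collected from the web:

```python
# Illustrative comparison data; names and numbers are placeholders.
providers = [
    {"name": "Provider A", "price_usd": 5, "ram_gb": 1},
    {"name": "Provider B", "price_usd": 10, "ram_gb": 2},
]

def comparison_table(rows, columns):
    """Render a list of dicts as a plain HTML comparison table."""
    head = "<tr>" + "".join(f"<th>{c}</th>" for c in columns) + "</tr>"
    body = "".join(
        "<tr>" + "".join(f"<td>{r[c]}</td>" for c in columns) + "</tr>"
        for r in rows
    )
    return f"<table>{head}{body}</table>"

html = comparison_table(providers, ["name", "price_usd", "ram_gb"])
```

Keeping the data as records and rendering the table from them also makes it easy to regenerate the page when prices or specs change.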
Okay, let's now move to tactic number five, which is multi-platform authority building. What I mean is simply that today you need to create content on different platforms; in my case, I create on my website, on YouTube, on X, and on Medium. This way, when language models see the same content, the same information, across different sources, it may build a trust signal that this information is cross-referenced on different platforms, and that's a good signal for the language model to generate content and responses based on this data. So pick your platforms, maybe three or four, and make sure at least once a week to repurpose your content across them, to signal to AI that your content and your brand are mentioned more or less everywhere and trusted across different sources. Tactic number six: update signals. In short, whenever you create a blog post or a web page, make sure to keep your content fresh and up to date, and make sure to show the updated date or created date on your web pages. That way, language models, when scraping and reading your pages, can see that the content is new and fresh and can rely on it when generating content. So, in short, keep your content updated and mention the update date inside your web pages. Tactic number seven:
something called JSON-LD, which is a JSON representation of the web page describing what's on it. This helps the model understand the structure and read the page in a structured format that AI, and search engines in general, love. For example, let me show you: if you go back to one of my web pages, a course page, for example this one, and I view the page source and scroll down, you will see I have this simple script containing a JSON that describes everything about the page: what this course teaches, the keywords, the location, the audience type, and so on. So when AI scrapers read the web page and find this JSON-LD structured data inside it, it's easier for the AI to understand what the page is about. Building this is fairly simple with WordPress. I use the Code Snippets plugin: I just created a new snippet, you can see here, "WP JSON-LD," and added this code, which you can generate with AI. I will show you how I did that in a second.
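The kind of script tag described above can be sketched like this. The course fields below are placeholders rather than the actual page data, and schema.org's `Course` type is one reasonable choice for a course page:

```python
import json

# Placeholder JSON-LD for a course page (schema.org vocabulary).
course = {
    "@context": "https://schema.org",
    "@type": "Course",
    "name": "WordPress SaaS 2.0",
    "description": "Build and sell SaaS products on WordPress.",
    "provider": {"@type": "Organization", "name": "Learn With Hassan"},
}

# The snippet ultimately prints this tag into the page's HTML.
script_tag = (
    '<script type="application/ld+json">' + json.dumps(course) + "</script>"
)
```

In the WordPress setup from the video, a PHP snippet would echo this same tag; the Python version just shows the structure being emitted.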
Then, when you go to your web page and edit it, you will see I have this simple shortcode, which I got from Code Snippets; you can see it gave me this shortcode, I just paste it here, and the JSON-LD gets embedded in the web page. Now, an easy way to create this JSON-LD: what I did, for example, is copy the content of the web page (yes, it's that simple) and ask ChatGPT to generate a JSON-LD for it. That's how I created it; ChatGPT will generate it, and you simply add it as a snippet and add it to your web page. Let's now move to the bonus
tip. Go back here to my website's toolbox, where I created a free tool for you called the LLM query generator. It helps you generate suggested prompts that people may use to find your web page. Again, if I go and copy this web page, you just need to paste any web page you want and click "Generate LLM queries." Using AI, it will scrape the web page content and generate up to 20 prompts that people may use to find your web page, or at least that you can brainstorm from. You can then use these prompts for tracking. You see here: "How can I build a SaaS application using WordPress?" People may ask ChatGPT this question, and your website may appear for this query. It's like keyword research, but for AI. So here we have around 20 different AI prompts, and you can copy some of them and use them in the system we built: go to LLM queries, then browse, and add them to this table to track them with our system. So use this tool to discover prompts that might be used to find your web pages and websites. I hope you learned something new today. If you did, smash the like button. And I think you'll now be interested in watching this video on building a full micro-SaaS with WordPress and no code, like me. Thank you for following, and see you in the next one.