YouTube Transcript:
How To Get Free Traffic From ChatGPT in 2025 (AIO Course)
Okay, so here I am in ChatGPT, and if I ask it a question like this, "What's the best course on SaaS on WordPress?", you will see WordPress SaaS 2.0 from my website, Learn With Hassan, pop up as the first result. You can see this is my website, and the source is ChatGPT. So I'm getting free traffic directly from ChatGPT for a prompt like this one. Let's copy it, open Claude, hit enter, and again you will see I'm the first result on Claude. Go to Perplexity, same prompt again, and you will see my course on top of Perplexity. So my website, my course, is ranking on top of AI models. In this video I will share with you seven techniques to rank your website on top of AI models like ChatGPT and Claude, and more importantly, I will share with you a free system I built so you can track your performance, your website mentions, inside AI models. If you are ready, let's get started.
Okay, before we start, let's recap together in about 30 seconds how organic traffic worked before AI, in 2023 and earlier. We had Google, and we would search for keywords, for example "WordPress SaaS course." This is a keyword people may search for, and then you rank your website on top of Google to get traffic directly from Google: organic, free traffic. So we used to do keyword research. We used keyword research tools like this one: I search for keywords, I see the search volume, and then I go to my website and optimize the page, what we call on-page SEO and off-page SEO, and you start optimizing to rank on Google. More importantly, we used to track performance with Google Search Console, so you can see how many clicks you are getting, how many impressions, and so on. So this is traditional SEO. Is it dead? In my opinion, of course not. I still get a lot of traffic from SEO today, and I believe it's still very important. But beside SEO, today we have something new: AIO, or AI optimization, to get traffic directly from AI models like ChatGPT and others. As I mentioned, in SEO we use Google Search Console to track our website's performance on search engines and see which keywords are getting impressions and clicks. For AI we need something similar: we need to track our website mentions on ChatGPT and other AI models. For example, as a website owner I want to know that this query here, this prompt, is generating a response that mentions my website, and so on, so I can track my website's performance on AI models.
So I did some research to find tools that can track performance on AI models, and I found things like LLMrefs, starting at $79 a month. Another tool is called Keyword.com. Another is SE Ranking, starting at $95 a month. We have Ahrefs starting at $129. We have Peec, or Peek, I don't know, at $199 or $89 a month. We have First Answer, starting at $39 a month for only 10 monitored prompts, and so on. So we have several tools, and honestly I didn't test any of them; they might be great. But in terms of pricing, they felt somewhat expensive, especially for people starting out on a tight budget. So I decided to build an alternative. In the first step, I will show you how to set up this system so you can track your performance on AI models. Then I will share with you the seven tactics to rank your website inside AI models. This way, we have a full course on AI optimization for free organic traffic from AI. To build this tracking system, I used Make.
I chose Make because it's no-code, it's super simple, and it's totally free to get started. If you are new here, Make is a no-code tool that allows you to build any automation, any system you want, in a super simple way, and I partnered with Make to give you one month of Pro access totally for free. Just sign up using the link in the description and follow along with me. In my case, let's log into my account. When you are in the dashboard, just click on Scenarios and click on Create a new scenario. Now, our tracking system consists of three scenarios: one is for tracking, which you'll see right now, and the other two are to build the user interface, like this one, this dashboard. You can see I click on this button and we get this simple report to track everything with a clean UI. So let's go back to our scenario. To make things simple for you, I will share in the description a link to the JSON file that you can use to import the scenario directly. Just click on these three dots here and import the blueprint: choose the file, select the blueprint, save, and boom, you get the scenario. Now, forget about this part here; I'll talk about it later. Our main flow is this one here.
Okay, it's super simple, as you will see right now. What's going on here is that the first module, the first step, just reads the queries, the prompts we want to track on AI. If you go here in Make, we have something called a data store, which is built into Make, and this is what makes Make great: it has all the features you want directly inside, without any external services. You can see here I created these four databases, or data stores, or tables, that connect with our scenario. The idea is super simple, as you will see right now. We have LLM queries. If you browse this one, you will see I added four queries, the four prompts I want to track. Again, if you go to ChatGPT here, this is our prompt. It's like keywords in the old SEO; today we have prompts, or what we call LLM queries. So here you add your queries. The second data store is for keywords. These are the keywords, or whatever you want to track. In my case, I want to track my website mentions, so I added "Learn With Hassan" and "learnwithhassan.com". These are the two main keywords I want to track.
And here we have two other tables: one for keyword tracking, where you will see I'm tracking the keywords whenever my website is mentioned, and another table for brand mentions. So this system not only tracks your website, it also tracks your competitors, and this is a real game changer: you can see where your competitors are ranking on AI, so you can optimize and outrank them. So we have four data stores, and it's super simple to create such a data store. If you click on edit, you will see I simply have a query field here; that's all. Just click Add data store, give it a name, for example "LLM queries", click Add, add an item, and simply add "query". That's it, click Save, and you get this data store, and so on. For tracking, if you open this one, we have query again, we have keyword, we have found, which is a boolean value, true or false, we have the date, and then the LLM, in case you are targeting multiple AI models like ChatGPT and Claude and so on, so we can track which AI we are monitoring, and then save. The same for keywords: it's a very simple structure, we only have a keyword here. And here we have brand mentions: you will see we have brand, query, date, and URL, and that's it, four fields. It's super simple. You just need to copy and paste these to build your four data stores.
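If you'd rather prototype this outside Make first, here is a minimal sketch of the same four tables as plain Python dataclasses. The field names mirror the data stores described above; the class and field names themselves are just illustrative.

```python
from dataclasses import dataclass
from datetime import date

# The same four tables described above, sketched as plain Python records.
# In Make these live in the built-in data stores.

@dataclass
class LlmQuery:
    query: str          # a prompt you want to track, e.g. "best course on SaaS on WordPress"

@dataclass
class Keyword:
    keyword: str        # a brand term to look for, e.g. "learnwithhassan.com"

@dataclass
class KeywordTracking:
    query: str          # the prompt that produced the response
    keyword: str        # the keyword that was checked
    found: bool         # True if the keyword appeared in the AI response
    checked_on: date    # when the check ran
    llm: str            # which model was queried, e.g. "gpt-4o"

@dataclass
class BrandMention:
    brand: str          # any brand/company mentioned in the response
    query: str          # the prompt that produced it
    mentioned_on: date  # when it was captured
    url: str            # the URL cited for that brand, if any
```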
When you build them, you will see the first module is simply connected to LLM queries, so we are reading the prompts we want to track. The second step is simply an iterator, which loops and reads them one by one. Then we go and call ChatGPT. Please focus on this one. What I'm doing is calling OpenAI with GPT-4o Search Preview. Here is one important point: if you go to ChatGPT again, start a new chat, and copy the prompt, you will see that ChatGPT is searching the web, you see "searching the web", before generating the answer. So it is using a tool to search and then it responds. The same applies here: if you want to mimic this behavior, you need to use the search preview model. You can also use the default model without web search to track your mentions in LLMs without search. So we have a model with search and a model without search; in my case, I'm using the search preview to mimic the same behavior as ChatGPT.
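If you want to reproduce this step outside Make, a minimal sketch with the OpenAI Python SDK might look like the following. It assumes the gpt-4o-search-preview chat-completions model and an OPENAI_API_KEY in your environment; model names and availability can change.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

prompt = "What's the best course on SaaS on WordPress?"  # one of the tracked LLM queries

# gpt-4o-search-preview mimics ChatGPT's "search the web" behavior;
# swap in a regular model (e.g. "gpt-4o") to track mentions without search.
completion = client.chat.completions.create(
    model="gpt-4o-search-preview",
    web_search_options={},  # enable web search with default settings
    messages=[{"role": "user", "content": prompt}],
)

answer = completion.choices[0].message.content
print(answer)
```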
Then I create these two OpenAI modules to extract the data in a formatted structure, in JSON. You can see here I tell it to return a JSON with brands and URLs, so it's simple and easy for me to read and save in a database. I will show you an example now to understand what's happening. This one here just takes the JSON, formats it well, and converts it to structured data; these modules are just for structuring the response we get from here. Then, when we have all the brand mentions, we iterate again, loop one by one, and save them to our second data store, you see here, to brand mentions. So here we are extracting all the brand mentions and saving them inside the data store.
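As a rough sketch of that extraction step (not the exact Make modules): you can ask a second model call to return only JSON and then parse it before saving. The prompt wording and field names below are assumptions based on the brands/URL structure described above.

```python
import json
from openai import OpenAI

client = OpenAI()

def extract_brand_mentions(answer: str) -> list[dict]:
    """Ask the model to pull every brand + URL out of a previous AI answer."""
    extraction = client.chat.completions.create(
        model="gpt-4o-mini",
        response_format={"type": "json_object"},  # force valid JSON back
        messages=[{
            "role": "user",
            "content": (
                "Extract every brand or website mentioned in the text below. "
                'Return JSON like {"mentions": [{"brand": "...", "url": "..."}]}.\n\n'
                + answer
            ),
        }],
    )
    data = json.loads(extraction.choices[0].message.content)
    return data.get("mentions", [])

# Each returned mention would then be appended as one brand-mention record in the data store.
```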
Then what I do is check whether my keywords, whether my website, is mentioned in the responses. You can see here I get my keywords from the keywords table, I loop over them, and here you can see we have a filter: I check if the message coming from ChatGPT contains my value, my keyword, and I save it inside the keyword tracking database. So, to sum up: in the first step you get all the prompts. In the first part here, we extract all the brand mentions, every company mentioned in the responses, and save them. Then the second part, these three modules, checks only for my keywords, filters for the responses containing my website, and saves them in this data store.
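The keyword check itself is just a substring filter; a minimal equivalent in Python (a sketch of the same logic rather than the Make filter itself) could be:

```python
from datetime import date

def track_keywords(query: str, answer: str, keywords: list[str],
                   llm: str = "gpt-4o-search-preview") -> list[dict]:
    """Build one keyword-tracking record per keyword, marking whether it appears in the answer."""
    return [
        {
            "query": query,
            "keyword": keyword,
            "found": keyword.lower() in answer.lower(),  # case-insensitive mention check
            "date": date.today().isoformat(),
            "llm": llm,
        }
        for keyword in keywords
    ]

# Example with a stand-in answer; rows with found=True are what the Make filter lets through.
sample_answer = "One popular option is WordPress SaaS 2.0 from learnwithhassan.com ..."
rows = track_keywords(
    "What's the best course on SaaS on WordPress?",
    sample_answer,
    ["Learn With Hassan", "learnwithhassan.com"],
)
print(rows)
```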
Let me run this system once inside Make to help you understand what's going on. I will run it: the first step gets the queries and starts looping one by one. This is the first iteration: we are calling ChatGPT with search and extracting the responses. You can see here we have the response message content; this is the ChatGPT response, and you can see my website is in here. So this is the response from ChatGPT. In the second step we extract this response, you see "choices" here, we are just transforming the response into this structured JSON. Then this module turns it into a structured response that Make understands; you can see now we have the first response, second response, third response, and so on. So now we have the full results, the full brand mentions, structured in JSON format. Then we have the iterator, and it saves all the records to the data store. And here we are filtering my mentions: you can see the filter, it checks if I am mentioned, and if I am, it stores the mention in the data store. If you go back and open, for example, brand mentions, you will see all the records that we got. We now have 180 records, brand mentions, saved inside this data store. So this is the first scenario you want to import. One note: you need to configure all these data stores, this one with LLM brand mentions, this one with LLM keywords, and this one with LLM keyword tracking. And finally, don't forget to set up your OpenAI key here. To add a connection, you just click add and enter your API key, so you have access to OpenAI. So the first scenario, the one that tracks and collects data, is ready. You just need to run it. You can schedule it, for example, once a day or once a week; it's up to you. Now we have the data saved. We just need a way to read and view it in this simple dashboard, so we can track it with a simple UI.
The second step is to create the other two scenarios, which are very basic. This one reads data from our keyword tracking table and returns it as JSON, as an array. Again, I will leave you the scenarios; you can just import them, it's very simple. We have a webhook, which acts like a URL. Let me show you: if I open it, it gives you a URL, and if I copy it, open my browser, paste, and hit enter, you will see I get the full data in JSON format right in the browser. So this is just a small system that allows me to read data from the database inside Make. That's the first one. And the second scenario reads the brand mentions the same way: we have a webhook, or call it an API endpoint, a URL that you get and use to read the full data in JSON format.
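If you prefer to pull that data into a script instead of the dashboard page, a minimal sketch with requests could look like this. The webhook URL below is a placeholder for whatever Make generates for you.

```python
import requests

# Placeholder: paste the webhook URL Make generates for your "read keyword tracking" scenario.
WEBHOOK_URL = "https://hook.eu1.make.com/your-webhook-id"

response = requests.get(WEBHOOK_URL, timeout=30)
response.raise_for_status()

records = response.json()  # the scenario returns the keyword-tracking rows as a JSON array
for record in records:
    status = "FOUND" if record.get("found") else "not found"
    print(f'{record.get("date")}  {record.get("llm")}  "{record.get("keyword")}"  ->  {status}')
```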
Of course, you are not going to read such data in the browser, which is why I created this simple interface for you. You just need to enter your Make URL here and click on Fetch data, and it will format everything: the query, your mention, the keyword, which language model, and the date, then the second keyword, and so on. The same for brand tracking: you just fetch the data and it will show you all your competitors, all the web pages ranking for the same keyword on ChatGPT. You can see we now have different websites; I can discover my competitors and see who is building something like my course, and so on. To make things simple, you just use this page, and if you want the source code, I will leave it for you so you can run it locally, or maybe optimize it, host it, and have your own system to track your mentions on language models. So again, we have three systems, three scenarios: one for tracking, one for reading the brand mentions data, and one for reading the keywords data, plus the UI. So you have the full system to track your performance on AI models.
Now, before I move on to the seven tactics to rank your website, now that we are monitoring performance, I want to mention that you can extend the system to track any AI you want. For example, you just need to go here and add a router. You see this route here: instead of ChatGPT you can use, for example, Anthropic Claude, and clone the same path here. You can add a route and search here for Perplexity and add Perplexity AI, or simply Gemini, and so on. So you can track multiple AI systems. Just make sure, when you save the data, to set the LLM field, in my case it's GPT-4o, and change it to Gemini or Claude or whatever. This way, in the UI, you can see the different LLMs here in this section.
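Outside Make, the same idea is just a loop over providers, tagging each saved record with the model that produced it. A sketch, where the provider list and the ask helpers are illustrative; each provider has its own SDK and model names:

```python
from typing import Callable
from openai import OpenAI

client = OpenAI()

def ask_openai(prompt: str) -> str:
    # Same search-preview call used earlier in the scenario.
    completion = client.chat.completions.create(
        model="gpt-4o-search-preview",
        web_search_options={},
        messages=[{"role": "user", "content": prompt}],
    )
    return completion.choices[0].message.content

# Add more providers here (Perplexity, Gemini, Claude, ...) using their own SDKs.
PROVIDERS: dict[str, Callable[[str], str]] = {
    "gpt-4o-search-preview": ask_openai,
    # "perplexity": ask_perplexity,   # placeholder: implement with Perplexity's API
    # "gemini": ask_gemini,           # placeholder: implement with Google's SDK
}

def run_all(prompt: str) -> list[dict]:
    """Query every configured provider and tag each answer with its LLM name."""
    results = []
    for llm_name, ask in PROVIDERS.items():
        answer = ask(prompt)
        results.append({"llm": llm_name, "query": prompt, "answer": answer})
    return results
```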
And what's nice about Make is that they introduced this new grid system where you can see your full scenarios in this awesome 3D view, so you can track exactly what's going on inside your systems. This is especially helpful if you want to scale the system and even build a backend for your micro-SaaS with Make. Yes, with a no-code tool like Make you can build a full backend for your micro-SaaS tools. You can see you can track everything, switch to 2D, turn on light mode, get notifications if anything happens, and spot any problems easily inside your account and your systems. Not only that: on my website I published a case study, a stress test I ran with Make, to check whether it can handle 1,000 requests per minute, simulating real-world micro-SaaS traffic. You can read this blog post, and in the results you can see I got a 100% success rate across all tests. So we can comfortably say Make can handle 1,000 requests per minute, you can build a full backend for your tools or micro-SaaS with no code and Make, and the grid system will definitely help you when you scale your scenarios.
scale your scenarios with make. One last
note I want to mention that this system
I built here is somehow simplified to
help you understand the concept. You can
add more fields you want to track for
example the history. So you can build a
chart. You can add more fields and track
more data within the system. You can
expand, optimize as you like and maybe
you can turn this into your own tool or
your own micros for LM tracking. Okay,
now it's time to see the seven tactics
to rank your website and put your
website inside AI responses putting it
in front of millions of users using AI
models every single day. But a small
disclaimer to be transparent. What I'm
about to share with you is not something
100% guaranteed simply because this
field this studies are still new and
there are limited test and research
around this topic and I myself am still
testing experimenting different tactics
to optimize my websites for AI and you
will see right now some examples I will
show you from my websites. I'm still
building and experimenting right now,
but I tried my best and did the
research, experiments, and tests to
share with you the most important
tactics and tips to rank your website on
AI and get free traffic from AI. So,
So, let's get started with tactic number one: adding statistics, numbers, and proof to your content. LLMs, or AI models, are trained to give priority to verifiable information and factual content when generating a response to a prompt. So try your best, whenever you write a blog post, a page, anything, to mention numbers and proof. For example, maybe not the best example, but something we can rely on: this test, the stress test. Here I'm sharing something I actually did, a test verified with numbers. And by the way, while I was doing and testing this, I asked ChatGPT something related, and you can see my website here, even my Medium blog, appeared in ChatGPT's results. So it used my content to respond to my prompt. I think this is because I have a real test, something verifiable, in my blog that LLMs can rely on to respond. So the idea is to write or create content such that when an LLM reads it, it can see verifiable information and build what we might call a trust or confidence score. This helps it consider your website as a source of information when generating responses.
Tactic number two: engaging on Reddit, forums, and Quora. LLMs, or AI models, are trained on millions of Reddit conversations and threads, Quora questions and answers, and user-generated content, because it's considered, or often seen as, authentic and unbiased. So language models and AI companies treat user content and discussions as a source of authenticity. This is why you should consider joining Reddit, forums, Quora, and communities, so you can send trust signals to language models through these sources. You can also build your own community, your own forum, on your website, so users engage, ask questions, and answer on your site. This may give another trust signal to AI that your website has authentic conversations, and it will treat it as a trusted source. For example, on my website you can see I have a community, I have forums where people engage, ask questions, and discuss topics related to marketing and digital products.
Tactic number three is FAQ optimization, frequently asked questions optimization. Before AI, people usually searched Google for keywords, for example "WordPress SaaS", just a keyword. Today, people lean more toward questions, asking full questions like "what is the best online course to build SaaS on WordPress?" So people will ask full questions, and this is where answering these questions on your website comes in. For example, in this blog post here, if you scroll down, you will see I have an FAQ section answering common questions people might ask about the topic. This way, when ChatGPT or another AI finds the exact same question on my website with the answer, combined with the trust signals we mentioned before, it may use my response, my website, to answer the user's question. You can search Google and use tools to find questions people are asking, and then answer them inside your blog posts and web pages. Even if you are building tools, like in my case with Toolbox, where I'm building tools, make sure to add an FAQ section like this one to answer common questions about those tools. People may ask questions like "what is the best keyword research tool?", and if I answer that question here, ChatGPT may use my answer and reference my web page, which includes my tool, and so on.
Tactic number four is what we call creating comparison tables, or structured data. These tables and comparison content help LLMs and AI models answer questions like "what is the best X?" or "what is the best product?", and so on. For example, just yesterday I was working on this new comparison page specifically to test this AI traffic strategy. You can see I'm building this VPS companies page on my website, selfhostchool.com, and here I'm adding different VPS providers with comparisons, getting trusted data, collecting real data from the web, and building this structured table on my website, hoping that whenever someone asks for a VPS comparison, or what the best VPS is, the AI can use my collected, structured data to answer people's questions, and I can get traffic to my website from AI using this strategy. So, in your niche, find what type of comparison data you can create, and build these tables or this content directly on your website. It's fairly simple today with WordPress and some custom snippets, if you are following my other videos on building SaaS or micro-SaaS or snippets on WordPress, and I will have a full video soon about creating such library and directory websites on WordPress in a full course here on my channel. So don't forget to turn on notifications to get every new update.
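If you want to prototype a comparison table like that before wiring it into WordPress, here is a minimal sketch that renders a structured list of providers into an HTML table you could paste into a page. The provider names, columns, and values are placeholder data, not the real comparison.

```python
import html

# Placeholder rows; in practice you would collect and verify this data from the web.
providers = [
    {"provider": "Provider A", "price_per_month_usd": 5, "ram_gb": 1, "storage_gb": 25},
    {"provider": "Provider B", "price_per_month_usd": 6, "ram_gb": 2, "storage_gb": 50},
    {"provider": "Provider C", "price_per_month_usd": 4, "ram_gb": 1, "storage_gb": 20},
]

columns = ["provider", "price_per_month_usd", "ram_gb", "storage_gb"]

def to_html_table(rows: list[dict], cols: list[str]) -> str:
    """Turn a list of dicts into a plain HTML comparison table."""
    head = "".join(f"<th>{html.escape(c)}</th>" for c in cols)
    body = "".join(
        "<tr>" + "".join(f"<td>{html.escape(str(r[c]))}</td>" for c in cols) + "</tr>"
        for r in rows
    )
    return f"<table><thead><tr>{head}</tr></thead><tbody>{body}</tbody></table>"

print(to_html_table(providers, columns))  # paste the output into your comparison page
```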
Okay, let's now move to tactic number five, which is multi-platform authority building. What I mean, simply, is that you need to create content today on different platforms. In my case, I create content on my website, on YouTube, on X, and on Medium. This way, when language models see the same content, the same information, across different sources, it may build a trust signal that this information is cross-referenced on different platforms, and that's a good signal for the language model to generate content and responses based on this data. So pick your platforms, say three or four, and make sure to repurpose your content on them at least once a week, to signal to AI that your content, your brand, is mentioned pretty much everywhere and is trusted across different sources. Tactic number six: update signals. In short, whenever you create a blog post or a web page, make sure to keep your content fresh, and make sure to show the updated date or created date on your web pages. That way, when language models scrape and read your pages, they can see that the content is new and fresh and can be relied on when generating content. So, in short, just keep your content updated and mention the update date inside your web pages.
Tactic number seven: something called JSON-LD, which is a JSON representation of the web page describing what's on it. This helps the model understand the structure and read the page in a structured format that AI, and search engines in general, love. For example, let me show you. If you go back to my web pages, the course page, for example, this one, and I go to view page source and scroll down, you will see I have this simple script containing this JSON, which holds all the information about what exactly this page is about: for example, what this course teaches, the keywords, the location, the audience type, and so on. So when AI scrapers read the web page and find this JSON-LD structured data inside it, it is easier for the AI to understand what the page is about. Building this is fairly simple with WordPress. I use the Code Snippets plugin: I just create a new snippet, and you can see here "WP JSON-LD". I added this snippet here, and you can generate it with AI; I will show you how I did that in a second. Then, when you go to your web page, if I edit the page, you will see I have this simple shortcode, which I got from Code Snippets. You see, it gave me this shortcode; I just paste it here, and then the JSON-LD is embedded within the web page. Now, an easy way to create this JSON-LD: what I did was copy the content of the web page, yes, it's that simple, and ask ChatGPT to generate JSON-LD for this web page. That's how I created it. It will generate it for you, and you simply add a snippet and add it to your web page.
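As a rough illustration of what such a block can look like, here is a minimal sketch that builds a schema.org Course object in Python and prints the script tag you would embed in the page. All the values are placeholders; ChatGPT or any generator would fill these in from your actual page content.

```python
import json

# Placeholder values; replace with the real details of your page.
course_jsonld = {
    "@context": "https://schema.org",
    "@type": "Course",
    "name": "WordPress SaaS 2.0",
    "description": "Learn how to build a SaaS application on top of WordPress.",
    "provider": {
        "@type": "Organization",
        "name": "Learn With Hassan",
        "url": "https://learnwithhassan.com",
    },
    "keywords": ["WordPress", "SaaS", "micro-SaaS", "no-code"],
    "audience": {"@type": "Audience", "audienceType": "Developers and online entrepreneurs"},
}

# The page just needs this script tag in its HTML <head> or body.
script_tag = (
    '<script type="application/ld+json">\n'
    + json.dumps(course_jsonld, indent=2)
    + "\n</script>"
)
print(script_tag)
```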
Let's now move to the bonus tip. Go back here to toolbox.com, my website, where I created a free tool for you called the LLM query generator. This will help you generate suggested prompts that people may use to find your web page. Again, if I go here and copy this web page: you just need to paste any web page you want from the web and click on Generate LLM queries. Using AI, it will scrape the web page content and generate up to 20 prompts that you can use, or at least brainstorm from, that people may use to find your web page. You can use these prompts for tracking. You see here: "How can I build a SaaS application using WordPress?" People may ask this question to ChatGPT, and your website may appear for this query. It's like keyword research, but for AI. So here we have around 20 different AI prompts, and you can copy some of them and use them in the system we built. We can go to LLM queries, then browse, and add them here inside this table to track with our system. So use this tool to discover prompts that might be used to find your web pages and websites.
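If you want to brainstorm such prompts yourself with a script rather than the tool, a minimal sketch could fetch a page and ask a model for candidate queries. The URL, prompt wording, and count below are placeholders; the tool described above likely works differently.

```python
import json
import requests
from openai import OpenAI

client = OpenAI()

def suggest_llm_queries(url: str, count: int = 20) -> list[str]:
    """Fetch a page and ask a model for prompts people might type into an AI assistant to find it."""
    page_text = requests.get(url, timeout=30).text[:15000]  # keep the prompt a reasonable size
    completion = client.chat.completions.create(
        model="gpt-4o-mini",
        response_format={"type": "json_object"},
        messages=[{
            "role": "user",
            "content": (
                f"Based on this web page HTML, suggest {count} questions or prompts a user might "
                'ask an AI assistant that this page could answer. Return {"queries": ["..."]}.\n\n'
                + page_text
            ),
        }],
    )
    return json.loads(completion.choices[0].message.content).get("queries", [])

# Example (placeholder URL): add the results to the "LLM queries" data store to start tracking them.
for q in suggest_llm_queries("https://example.com/your-page"):
    print(q)
```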
I hope you learned something new today. If you did, smash the like button. And I think you'll now be interested in watching this video and building a full micro-SaaS with WordPress and no code, like me. Thank you for following, and see you in the next one.