
SEO Tools - Introduction

Translation Draft

Another particularly important aspect of creating a website is shaping what will be the site's "presence" in search engine results.

Google, for example, is a fully automated search engine that uses a suite of programs known as "web crawlers", "spiders", etc. (the Bing search engine and others work similarly). With these programs, the pages existing on the web are periodically indexed, automatically and free of charge (there are also web pages that are submitted manually for indexing).

Specifically, when we search the web, we are actually searching the search engine's index, which returns links to the indexed sites/pages. And this index does not include all web pages, only those that the crawlers have discovered and indexed.

These result lists obviously cannot be presented in alphabetical order, in the order of "arrival" in the web space, or by some other arbitrary criterion. There must be something "behind" them that gives the optimal answer to the user, the one who is looking for something in the web space.

And that is why those who own, maintain, etc. search engines make everything as reliable and as useful as possible, permanently developing the algorithms, software, etc. that lie behind the operation of these search engines.

In addition, there are the interests of the developers, who need financial resources to support what is a real industry "behind" search engines, resources obtained from advertising, promotion, etc. I won't elaborate, because this is the problem of Google, Microsoft, etc., but I stress that the results given by search engines are mostly (almost completely) useful and impartial.

It would be useful for you to understand how these indexing programs (crawlers, spiders, etc.) work.

They access the main page, usually "index" .html, .htm, .php, .asp, etc., and from there they begin to index all the links inserted on this page, then the links that appear on these new pages, and so on...

So excessive branching of the website, the so-called "broken links", etc. are the first cause of existing pages not being indexed and, consequently, of a weak presence of your site in the web space.
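
To make the idea a little more concrete, here is a minimal sketch of such a "follow the links, note the broken ones" crawl. It is only an illustration under my own assumptions (the start URL, the page limit, the use of Python's standard library), not how Google or Bing actually work internally:

```python
# Minimal illustrative crawler: follows links from a start page and
# reports broken ones. A sketch of the "follow every link" idea only.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen
from urllib.error import HTTPError, URLError

class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, max_pages=50):
    seen, queue, broken = set(), [start_url], []
    domain = urlparse(start_url).netloc
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except (HTTPError, URLError):
            broken.append(url)        # a "broken link" the crawler cannot index
            continue
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href)
            if urlparse(absolute).netloc == domain:   # stay on the same site
                queue.append(absolute)
    return seen, broken

if __name__ == "__main__":
    # Hypothetical start page; replace with your own site's index page.
    indexed, broken = crawl("https://example.com/index.html")
    print(f"indexed: {len(indexed)} pages, broken: {len(broken)} links")
```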

So indexing and displaying results are the main "unseen" activity that underpins search engines and what we can consider to be the accessible web space.

I said "main" because there is obviously also a result consisting of the analysis, synthesis, processing, etc. of these results, "delivered" through specific tools such as "Google Search Console", "Microsoft Bing Webmaster Tools", etc.

Returning to "indexing", searching for a user on a specific "search term" can appear a huge number of results.

This is where the ranking algorithms come in which, at the time of writing, use more than 200 selection criteria to give an answer as close as possible to what the searcher wants.

Well, these algorithms take into account, first and foremost, whether the search term (word/keywords) appears in the title of the page and in the actual web address (URL), how many times the keywords appear in the content of the page, whether synonyms of the keywords are present on the page, whether the page comes from a quality site or from a "doubtful" one, what "PageRank" the page has, etc.
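
Just to illustrate the principle of combining such signals, here is a deliberately naive "relevance score". The signals chosen and the weights are my own invented assumptions, nothing like the real, non-public algorithms:

```python
# Toy relevance score combining a few of the signals mentioned above.
# Weights are arbitrary illustrative values, not real ranking factors.
def toy_relevance_score(query: str, title: str, url: str, body: str) -> float:
    q = query.lower()
    score = 0.0
    if q in title.lower():
        score += 3.0                                   # keyword in the page title
    if q in url.lower():
        score += 2.0                                   # keyword in the URL
    score += min(body.lower().count(q), 10) * 0.5      # capped keyword frequency
    return score

# Hypothetical example: which of two pages would rank higher for "seo tools"?
page_a = toy_relevance_score("seo tools", "SEO Tools - Introduction",
                             "https://example.com/seo-tools", "seo tools " * 4)
page_b = toy_relevance_score("seo tools", "My homepage",
                             "https://example.com/page2", "welcome to my site")
print(page_a, page_b)   # page_a scores higher, so it would be shown first
```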

For example, analyzing a search term is primarily about understanding the meaning of the search, what the word or words in a user's query mean.

And so it comes down to language patterns, interpreting spelling errors, trying to understand what type of query was used, a system of synonyms... all of which adds up to an immense effort to understand natural language and words with multiple meanings... Many, many criteria...

Freshness algorithms are also important: they tend to favor quick responses based on how recent the material found on the web is, relative to the search terms used. This is a useful "response" for users looking for something through search engines, but it can be a huge disadvantage for site builders.

Bringing these factors together results in a kind of overall score for each page, and this score is the basis for how "favorably" your web pages are displayed in the search engines' "response" pages.

So the better you comply with these "criteria"/algorithms, the more "favorably" you will appear in the search engine responses and thus the more hits you will get.

All these algorithms have their importance, greater or lesser, but together they determine, and can increase, the chances of your web pages actually being visible in the web environment.

For example, "PageRank" is an algorithm that evaluates the importance of a web page based on the number of external links that send to that page and the importance of those links.

Another particularly important point is that it does not really matter when you intervene to help this correct and effective indexing along.

Crawlers/spiders initially index the web pages they discover in the web space as new pages, "storing" their URLs in lists that they then track and update permanently. After the initial indexing they follow every change made to the pages already indexed and make the necessary changes to the existing (stored) URL lists.
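
A rough sketch of what "tracking changes" can mean: compare what the server now reports for a stored URL with what was recorded at the previous visit. The headers used and the URLs below are illustrative assumptions; real crawlers rely on many more signals:

```python
# Decide whether an already-indexed URL needs re-crawling by comparing
# the page's ETag / Last-Modified headers with what we stored earlier.
from urllib.request import Request, urlopen

stored = {  # what the crawler "remembered" from the previous visit (hypothetical)
    "https://example.com/index.html": {
        "etag": '"abc123"',
        "last_modified": "Mon, 01 Jun 2020 10:00:00 GMT",
    },
}

def needs_recrawl(url: str) -> bool:
    response = urlopen(Request(url, method="HEAD"), timeout=10)
    etag = response.headers.get("ETag")
    last_modified = response.headers.get("Last-Modified")
    previous = stored.get(url)
    if previous is None:
        return True                              # never seen: index as a new page
    return etag != previous["etag"] or last_modified != previous["last_modified"]

print(needs_recrawl("https://example.com/index.html"))
```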

So far everything is OK: our pages are present in the possible search engine responses, without problems.

So can you always take advantage of the right moment to refine the response to a user's search (on a search engine) so that the best answer is "served"?!?

What interests us is how we can use these algorithms to make our web pages as accessible as possible to users, even under the most demanding conditions.

You can develop a site of inestimable value, a kind of huge gold nugget, that is nevertheless not "seen" by those looking for such nuggets. And that is because you do exist in the search engine responses, but...

These search engines "show" you somewhere on page "x" of the answers, where "x" is a much larger number than the "y" representing the patience or time of the user "designed" in the number of "response" pages of the search engines.

Studies "show" that the "searcher's" patience on a search engine leads somewhere to the first 10 pages of answers (and, believe me, 10 is much "inflated" for the optimism of this exposure).

So if our nugget appears somewhere around page 15, it might as well be at the center of the earth, inaccessible to almost anyone.

This is where the "SEO" optimizations that we can use, adopt, implement, etc. come in that will attract the improvement of the "presence" of our web pages at the level of search engines, which will bring our nugget as "to the surface", becoming accessible to anyone who is looking for such a thing.

Do not "look" with indifference these optimizations! Even if you develop hobby web pages it is obvious that you will have an interest in "back" if you invest your life time in something like this. But if we talk about a business site, blog, etc. everything gets more serious than you would imagine.

A significant detail is that the web environment is like a library whose number of pages, books, brochures, etc. is constantly increasing (we are already talking about hundreds of billions of pages), and a lack of "reaction"... is not very good...

It really is necessary to pay attention to these details! Don't forget the "comparison" with the nugget at the center of the earth. You need a real, active presence on the net, regardless of considerations like: "I give the address to customers, friends, etc. and I am satisfied"... or other blah, blah.

Even the "famous" social networks have a kind of "algorithm" (which has little to do with the SEO tools of web pages) that allows "explosions" of hits, such as "tag", blah, blah... But here everything is much simpler, being an algorithm based on the number of "social connections" and the number of "reactions" type distribution, likes, blah, blah.

STEP 1

Well, the first step in making this work is to "sign up", creating an account with the "webmaster tools" type of services (Google Search Console, Microsoft Bing Webmaster Tools).

The account is created like any similar account, except that you are required to prove that you own the site you "enroll" in these utilities. For this you are given a file that you upload to the site's main directory/folder, usually "public_html", on the server where the site (web pages) is hosted.
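
After uploading that file, a quick sanity check is to confirm it is publicly reachable; a minimal sketch (the domain and file name are hypothetical placeholders, yours will be different):

```python
# Check that the ownership-verification file uploaded to public_html
# is reachable over HTTP. Domain and file name are hypothetical.
from urllib.request import urlopen
from urllib.error import HTTPError, URLError

url = "https://example.com/google1234567890abcdef.html"  # placeholder name
try:
    status = urlopen(url, timeout=10).status
    print("reachable" if status == 200 else f"unexpected status {status}")
except (HTTPError, URLError) as err:
    print("not reachable:", err)   # verification in the webmaster tools would fail too
```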

The procedure is practically instantaneous and will not take you more than 15 minutes to complete (obviously, if you already have everything you need: hosting, a website and the elements that connect them into a working site).

From then on you have at your disposal a number of tools that provide fairly relevant data about the web pages you offer to the "web environment".

You will have access to performance details (graphs with the number of clicks received by your web pages, coverage, relevance, experience, etc.), presented on a timeline so that you have a clear and definite picture of what your web pages represent for the web space, etc., etc.

I will not go into more detail now because everything will "come" at the right time. For now, let's focus on this "Step 1"!

Simply registering your site with these services gives you useful information, but it also lets site owners submit the structure of the site (so-called sitemaps) to facilitate content indexing.
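
For orientation, a sitemap is essentially an XML list of your page URLs in the format documented at sitemaps.org. A minimal sketch that generates one (the URLs are hypothetical placeholders):

```python
# Generate a minimal sitemap.xml listing a few hypothetical page URLs.
# The resulting file is what you would upload next to your index page
# and then submit in Google Search Console / Bing Webmaster Tools.
urls = [
    "https://example.com/",
    "https://example.com/about.html",
    "https://example.com/articles/seo-tools.html",
]

entries = "\n".join(f"  <url><loc>{u}</loc></url>" for u in urls)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>\n"
)

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(sitemap)
print(sitemap)
```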

With the "utilities" that are offered to webmasters, you will first have information about search performance, represented by:

- the total number of clicks made to view the site's content (clicks),

- the number of "response" views of searches by web surfers that indicate links to your site, which appear in search engines (total impressions),

- the percentage of displayed links that were actually clicked (average CTR; see the short example after this list),

- a kind of "average" positioning in a top of the placement of the pages representing the response of the search engine (average position), of course indicating the position of the page belonging to you,

- "crawl requests",

- crawl errors,

- the number of pages indexed,

- the "key" words used by those looking for something and reached the search engine response to your site and the pages that were accessed following the search engine search response,

- other relevant information related to user searches, such as the searchers' countries of origin, the devices from which the search was made, timeline elements regarding these hits, etc.
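
As a small illustration of how two of these figures relate, the average CTR is simply clicks divided by impressions; the numbers below are invented for the example:

```python
# CTR (click-through rate) from the clicks and impressions figures
# reported by the webmaster tools. Example numbers are invented.
clicks = 120
impressions = 4800

average_ctr = clicks / impressions * 100
print(f"average CTR: {average_ctr:.2f}%")   # 2.50% of impressions turned into clicks
```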

All this gives you clear information about what your website represents to the users of the web space.

But although the information above may be what you mainly care about, these webmaster tools also come with "technical" information related to:

- errors on web pages that prevent those pages from being accessed and indexed, and consequently from appearing in the search engine response,

- the number of web pages that have been indexed but have errors,

- the number of web pages indexed,

- the number of web pages that have been indexed but which, for various reasons, are excluded from the answer "shown" by search engines,

- information about use on various search devices (desktop, mobile),

- information about security issues "raised" by web pages,

- other actual SEO reports.

So these are just a few advantages brought by using these setups, tools, etc.; they suggest measures the webmaster can take to improve the site's presence in search engines and thus the "efficiency" of access to the site and its content.

Note that the vast majority of hosting offers currently available provide you with various SEO tools, such as AWStats, Visitors (and others) in the case of cPanel, the SEO features typical of CMS (Content Management System) platforms, etc.

So I will discuss in detail everything that can be done with these tools, so that we can improve the so-called "SEO score" of the site you invest your time in, regardless of the rationale behind this effort.

Dorin M - 19 June 2020