Case study: technical audit for Ford

  • Author

    Oleg Vereitin

  • Published

    07/16/2022

  • Reading duration

8 minutes

In this case study we show only 7% of the technical audit we performed for Ford.

We analyzed the technical side of the site to understand what problems are present and which of them prevent the site from increasing the flow of potential customers. In this audit, we covered the following parameters:

1. Checking the server
1.1 Search for pages with 3** response code
1.2 Search for pages with 4** response code
1.3 Search for pages with no response
1.4 Search for pages with 5** response code
2. Robots.txt
3. Sitemap.xml
4. Technical duplicates
4.1 Search for duplicates of the main page
4.2 Pages with GET parameters
4.3 Duplicate pages with multiple slashes ///
5. Duplicates in tags and meta tags
5.1 Search for duplicates in Title
5.2 Search for H1 duplicates
5.2.1 H1 duplicates found on the site
5.2.2 Pages without H1 found on the site
6. External links
7. Broken backlinks

1. Checking the server

1.1 Search for pages with 3** response code

Links with 301 redirects were found on the site: 1,048,576 invalid URLs. We recommend replacing them with the final destination links to speed up indexing of the site.

1.2 Search for pages with 4** response code

Pages with a 404 server response were found on the site: 133 invalid pages. These URLs have to be removed from the site or replaced with links to relevant pages.

1.3 Search for pages with no response

Only external links with no response were found on the site: 4 invalid URLs. These links have to be removed from the site or replaced with links to relevant pages.

1.4 Search for pages with 5** response code

Pages with a 5** server response were found on the site: 133 invalid pages. These URLs have to be removed from the site or replaced with links to relevant pages.
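For illustration, this kind of server response check can be automated with a short script. Below is a minimal Python sketch, assuming the requests library is installed and using a hypothetical URL list (the actual audit was done with a dedicated crawler):

import requests

# Hypothetical list of URLs collected by a crawler
urls = [
    "https://www.ford.com/",
    "https://www.ford.com/suvs/",
]

for url in urls:
    try:
        # allow_redirects=False reports 3** codes as-is instead of following them
        response = requests.head(url, allow_redirects=False, timeout=10)
        code = response.status_code
    except requests.RequestException:
        code = "No response"
    print(url, code)

Any URL that prints a 3**, 4**, 5** or "No response" result falls into one of the problem groups above.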

2. Robots.txt

Robots.txt is available at https://www.ford.com/robots.txt

The current Allow / Disallow directives are incorrect. Closing directories and pages via robots.txt is only advisory for Googlebot: it can bypass such prohibitions anyway, and the blocked pages do not drop out of the index. It is necessary to generate a correct sitemap.xml and bring robots.txt to the following form for the main domain:

User-agent: *
Host: https://www.ford.com/
Sitemap: https://www.ford.com/sitemap.xml

All pages that should not be indexed have to be closed from indexing via the meta robots tag or the X-Robots-Tag header (see the examples below). The robots.txt file also contains 2 broken sitemap links:

Sitemap: https://www.ford.com/href-sitemap-en-us.xml.
Sitemap: https://www.ford.com/href-sitemap-en-us.xml
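For reference, closing a page from indexing can be done either with a meta tag in the HTML head or with an HTTP response header; both are standard and read by Googlebot:

<meta name="robots" content="noindex">

X-Robots-Tag: noindex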

3. Sitemap.xml

The current sitemap: https://www.ford.com/sitemap.xml

The sitemap must be regenerated based on the following requirements:
1) The sitemap must not contain URLs returning 3xx, 4xx, or 5xx response codes.
2) The sitemap must not contain URLs that are blocked from indexing by the robots.txt file or by the meta robots tag.
3) Suggestions for tags in the sitemap:
3.1) Google reads the values in the <lastmod> tag as long as they are specified without distorting the facts.
3.2) The values in the <priority> and <changefreq> tags are ignored by Google, so they don't need to be added.
4) Sitemap file can contain no more than 50,000 URLs, and its uncompressed size should not exceed 50 MB. If the size of the Sitemap file exceeds the allowable size, you need to split it into several parts.
5) Use the same syntax when specifying a URL. Google will crawl them exactly according to the list. For example, if your site is hosted at https://www.example.com/, don't use the URL https://example.com/ (without www) or ./mypage.html (relative URL).
6) Do not include session identifiers in the URL, this can lead to excessive page crawling.
7) The Sitemap file must define the following XML namespace: xmlns="http://www.sitemaps.org/schemas/sitemap/0.9" (see the example after this list).
8) The URLs in the Sitemap file must be specified in UTF-8 encoding, and the web server hosting the file must serve it in that encoding.
9) A sitemap can only describe pages of the domain where it is located; pages of subdomains or other domains cannot be included.
10) When the file is requested, the server should return a 200 response code.
11) The sitemap should be automatically updated when pages are added or removed from the site.
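To make requirements 3.1, 7, and 8 concrete, here is a minimal valid sitemap.xml; the URL and date are placeholders:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.ford.com/suvs/</loc>
    <lastmod>2022-07-01</lastmod>
  </url>
</urlset>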

4. Technical duplicates

4.1 Search for duplicates of the main page

Duplicates of the main page were found. Examples:

https://www.ford.com////
https://www.ford.com/index/
https://www.ford.com/?123123123123

Solution: implement a self-referencing rel="canonical" tag on the page https://www.ford.com/; it will eliminate all these duplicates.
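A self-referencing canonical tag is a single line in the <head> of the page:

<link rel="canonical" href="https://www.ford.com/">

Since the duplicate URLs above serve the same HTML document, each of them will carry this tag and point Google to the canonical address.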

4.2 Pages with GET parameters

Duplicate pages with GET parameters were found. Examples:

https://www.ford.com/?123123123123
https://www.ford.com/support/vehicle/bronco-sport/2021/how-to-videos/video-library/more-vehicle-topics/6268479066001?name=cleaning-your-wipers—bronco-sport

Solution: implement a self-referencing rel="canonical" tag on the pages without GET parameters; it will eliminate all these duplicates.

4.3 Duplicate pages with multiple slashes ///

Duplicate pages with multiple slashes were found on the site. Example: https://www.ford.com////

Solution: implement a self-referencing rel="canonical" tag on the single-slash pages; it will eliminate all these duplicates.

5. Duplicates in tags and meta tags

5.1 Search for duplicates in Title

Pages with duplicate Titles were found on the site: 91 invalid pages. Duplicate Titles appear for several reasons:

1) Duplicate pages. They will disappear after the recommendations from point 4 (Technical duplicates) are implemented.
2) Subpages for different car models have the same Title. The car model needs to be added to the Title of each such page.

Examples:

https://www.ford.com/support/vehicle/ranger/2022/how-to-videos/video-library/keys-and-locks/6284432964001?name=setting-climate-controls-during-remote-start
https://www.ford.com/support/vehicle/escape/2022/how-to-videos/video-library/more-vehicle-topics/6305723913112?name=setting-climate-controls-during-remote-start/

3) The same pages are accessible under different category paths in their URLs, which makes them plain duplicates. Set a rel="canonical" tag pointing to the primary page; the page itself can remain available in the other categories.
Examples:

https://www.ford.com/support/how-tos/electric-vehicles/home-charging/how-do-i-open-the-front-luggage-compartment-on-my-mustang-mach-e-without-vehicle-power/
https://www.ford.com/support/how-tos/more-vehicle-topics/batteries/how-do-i-open-the-front-luggage-compartment-on-my-mustang-mach-e-without-vehicle-power/

5.2 Search for H1 duplicates

5.2.1 H1 duplicates found on the site

The solution is presented in the table (see the full audit): 525 invalid pages. H1 duplicates appear for the same reasons as Title duplicates.

5.2.2 Pages without H1 found on the site

Pages without an H1 were found on the site: 770 invalid pages. These pages have no H1 until they are fully loaded; the H1 appears only after the full load, so Googlebot could miss it.
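One way to detect such pages at scale is to check the raw HTML response, which is what Googlebot receives before rendering. A minimal Python sketch, assuming the requests library and using a hypothetical page URL:

import requests

url = "https://www.ford.com/"  # hypothetical example page
html = requests.get(url, timeout=10).text

# If the H1 is only added by JavaScript after the page loads,
# it will be missing from the raw HTML response
if "<h1" not in html.lower():
    print(url, "has no H1 in the initial HTML")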

6. External links

External links were found on the site: almost every page contains external links to social networks and to similar products on other sites. All the links listed in the document must be closed with the rel="nofollow" attribute. These are 756,649 invalid links.

All new external links should contain rel="nofollow".
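For example, a social network link with the attribute looks like this (the URL is a placeholder):

<a href="https://www.facebook.com/ford" rel="nofollow">Ford on Facebook</a>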

7. Broken backlinks

The site's backlinks have been checked: 188,000 broken links to the site were found. We recommend restoring or redirecting the pages that previously existed on the site in order to get those links back.
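A removed page can be redirected to the closest relevant live page with a 301. For example, in an Apache .htaccess file (both paths are hypothetical):

Redirect 301 /old-model-page/ https://www.ford.com/new-model-page/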

This is only 7% of the technical audit from this case. To find out what a full audit includes and its price, click here.

Want to see our audits and find out the cost for you?

Make an appointment with us on a day and at a time convenient for you, and we will demonstrate our solution and pricing right away!