Technical audit case study: Macy's

  • Author

    Oleg Vereitin

In this case study we show only 6% of the technical audit that we perform

We analyzed the technical side of the site to identify the problems that are preventing it from increasing the flow of potential customers. This audit covers the following parameters:

1. Checking server
1.1 Search for pages with 3xx response code.
1.2 Search for pages with 4xx response code.
1.3 Design of the 404 page
1.4 Checking analytics on a 404 page
1.5 Search pages No Response
2. Pages in index
3. Robots.txt
4. Sitemap.xml
5. Technical duplicates
5.1 Checking the availability of http and https
5.2 Search for duplicates of the main pages
5.2.1 Duplicates in tags and meta tags
5.2.2 Pages without h1 found on the site
6. Breadcrumbs
7. External links
8. Site’s IP check
9. Checking the site for viruses, malicious scripts and the presence of the site in blacklists
10. Check for presence and errors in the SSL certificate
11. Broken backlinks

1. Checking server

1.1 Search for pages with 3xx response code

Links pointing to 301 redirects were found on the site. We recommend replacing them with the final destination URLs to speed up indexing of the site. More than 1,048,576 pages were found.
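As an illustration, the per-status recommendations in this section can be sketched as a small triage step. The (url, status, final_url) tuple shape is an assumption about how a crawler might export its results:

```python
def triage(crawl_results):
    """Sort crawled URLs into recommended actions by HTTP status class."""
    actions = {"replace_with_final": [], "remove_or_redirect": [], "investigate": []}
    for url, status, final_url in crawl_results:
        if 300 <= status < 400:
            # 3xx: link directly to the final destination instead
            actions["replace_with_final"].append((url, final_url))
        elif 400 <= status < 500:
            # 4xx: remove the link or point it at a relevant live page
            actions["remove_or_redirect"].append(url)
        elif status >= 500:
            # 5xx: server-side errors need investigation
            actions["investigate"].append(url)
    return actions
```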

1.2 Search for pages with 4xx response code

Pages returning a 404 server response were found on the site. These URLs should be removed from the site or replaced with links to relevant pages. More than 80,000 such pages were found.

1.3 Design of the 404 page

A custom 404 page is implemented on the site, but at the moment its design is poor. Many visitors land on 404 pages, so it is important to encourage them to keep searching for products on the site. The best option is to implement a design similar to competitors:


1.4 Search for pages with no response

Only external links returning no response code were found on the site. These URLs should be removed from the site or replaced with links to relevant pages. More than 13,125 such pages were found.

1.5 Searching for pages with a 5xx response code

Pages with 5xx server responses were not found on the site.

2. Pages in index

Technical pages were found in index:

Other pages were also found in the index. We recommend blocking these pages from indexing with the X-Robots-Tag header or the meta robots tag, so that they are removed from the index and not re-indexed.
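One way to apply the X-Robots-Tag recommendation is at the application layer. A minimal sketch, assuming a Python WSGI stack; the path prefixes are hypothetical examples, not Macy's real URL scheme:

```python
# Hypothetical prefixes identifying technical pages that should not be indexed.
NOINDEX_PREFIXES = ("/account/", "/checkout/", "/search")

def noindex_middleware(app):
    """WSGI middleware that adds an X-Robots-Tag header to matching paths."""
    def wrapped(environ, start_response):
        def sr(status, headers, exc_info=None):
            if environ.get("PATH_INFO", "").startswith(NOINDEX_PREFIXES):
                headers = list(headers) + [("X-Robots-Tag", "noindex, follow")]
            return start_response(status, headers, exc_info)
        return app(environ, sr)
    return wrapped
```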

3. Robots.txt

Robots.txt is available at https://www.macys.com/robots.txt


These Allow/Disallow directives are incorrect. Blocking directories and pages via robots.txt is only advisory for Googlebot, which may crawl such URLs anyway; moreover, blocked pages are not removed from the index. It is necessary to generate a correct sitemap.xml and reduce robots.txt for the main domain to the following form:

User-agent: *
Host: https://www.macys.com/
Sitemap: https://www.macys.com/navapp/dyn_img/sitemap/mcom_sitemapindex.xml

All pages that should not be indexed must be blocked from indexing via the meta robots tag or the X-Robots-Tag header.
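The effect of robots.txt directives can be checked offline with Python's urllib.robotparser before deploying changes; the directives below are illustrative examples, not Macy's actual file. Note that this only tells you what crawling is blocked; it does not remove pages from the index.

```python
from urllib.robotparser import RobotFileParser

# Example rules for illustration only.
rules = """\
User-agent: *
Disallow: /checkout/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("Googlebot", "https://www.macys.com/checkout/cart"))  # False: crawling blocked
print(rp.can_fetch("Googlebot", "https://www.macys.com/"))               # True
```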

4. Sitemap.xml

Current sitemap https://www.macys.com/navapp/dyn_img/sitemap/mcom_sitemapindex.xml

Sitemap must be regenerated based on the following requirements:

1) The sitemap must not contain URLs that return 3xx, 4xx, or 5xx response codes.
2) The sitemap must not contain URLs that are blocked from indexing by the robots.txt file or a meta tag.
3) Suggestions for tags in the sitemap:
3.1) Google reads the values in the <lastmod> tag as long as they are consistently accurate.
3.2) The values in the <priority> and <changefreq> tags are ignored by Google, so they do not need to be added.
4) Sitemap file can contain no more than 50,000 URLs, and its uncompressed size should not exceed 50 MB. If the size of the Sitemap file exceeds the allowable size, you need to split it into several parts.
5) Use consistent syntax when specifying URLs; Google will crawl them exactly as listed. For example, if your site is hosted at https://www.example.com/, do not use the URL https://example.com/ (without www) or ./mypage.html (a relative URL).
6) Do not include session identifiers in URLs, as this can lead to excessive page crawling.
7) The Sitemap file must define the following XML namespace: xmlns="http://www.sitemaps.org/schemas/sitemap/0.9".
8) The URLs in the Sitemap file must be specified in UTF-8 encoding, and its encoding must be understandable by the web server hosting this file.
9) Sitemap can only describe pages of the domain where it is located. Pages of subdomains or other domains cannot be described.
10) When the file is requested, the server should return a 200 response code.
11) The sitemap should be automatically updated when pages are added or removed from the site.
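A few of the requirements above (the 50,000-URL limit and the same-domain rule) can be checked automatically. A minimal sketch using Python's standard library; the function name and problem messages are illustrative:

```python
import xml.etree.ElementTree as ET
from urllib.parse import urlparse

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def validate_sitemap(xml_text, expected_host, max_urls=50_000):
    """Return a list of problems found in a <urlset> sitemap (sketch)."""
    problems = []
    root = ET.fromstring(xml_text)
    locs = [el.text.strip() for el in root.iter(NS + "loc")]
    if len(locs) > max_urls:
        problems.append(f"too many URLs: {len(locs)}")
    for loc in locs:
        # Requirement 9: the sitemap may only describe pages of its own domain.
        if urlparse(loc).netloc != expected_host:
            problems.append(f"foreign host: {loc}")
    return problems
```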

5. Duplicates in tags and meta tags

5.1 Search for duplicates in Title

Pages with duplicate Titles were found on the site (see the full audit).

Duplicate Titles appear for a couple of reasons: brand filters normally append text to the Title, but this particular category is empty (probably because it contains no products), so nothing is appended and the Titles collide.
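Duplicate Titles (or H1s) can be detected from a crawl export with a simple grouping pass; the (url, text) pair format is an assumption about the export shape:

```python
from collections import defaultdict

def find_duplicates(pages):
    """Group URLs by normalized tag text; return only groups with 2+ URLs."""
    groups = defaultdict(list)
    for url, text in pages:
        groups[text.strip().lower()].append(url)
    return {text: urls for text, urls in groups.items() if len(urls) > 1}
```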

5.2 Search for H1 duplicates

5.2.1 H1 duplicates were found on the site

The solution is presented in the table (see the full audit). We recommend adding the primary category name to the H1 to get rid of duplicates.

5.2.2 Pages without H1 found on the site

Pages without an H1 were found: more than 8 pages, including the main page. Solution: we recommend changing the h2 shown in the screenshot below to an h1, and doing the same for the first h2 on each affected page.
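Pages missing an H1 can be flagged automatically during a crawl. A minimal sketch using Python's built-in HTML parser:

```python
from html.parser import HTMLParser

class H1Counter(HTMLParser):
    """Count <h1> tags in a page."""
    def __init__(self):
        super().__init__()
        self.h1_count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.h1_count += 1

def has_h1(html):
    parser = H1Counter()
    parser.feed(html)
    return parser.h1_count > 0
```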

6. Breadcrumbs

Breadcrumbs are already implemented on the site.

7. External links

External links were found on the site: almost every page links out to social networks and to similar products on other sites. All such links listed in the document should be closed with rel="nofollow"; 211,169 pages were found. All new external links should also include rel="nofollow".
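External links missing rel="nofollow" can be found with a small parser pass; the class below is an illustrative sketch using Python's standard library:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class ExternalLinkChecker(HTMLParser):
    """Collect external <a> links that lack rel="nofollow"."""
    def __init__(self, own_host):
        super().__init__()
        self.own_host = own_host
        self.missing_nofollow = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href", "")
        host = urlparse(href).netloc
        # Relative links have an empty netloc and are internal by definition.
        if host and host != self.own_host:
            rel = (attrs.get("rel") or "").lower()
            if "nofollow" not in rel:
                self.missing_nofollow.append(href)
```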

8. Site’s IP check

1. All sites hosted on the same IP address are in order. This check is done to ensure that porn sites, illegal sites, gambling sites, and other malicious sites are not located on the same IP as yours, which would negatively affect the IP's reputation.

2. The site's IP address is located in the Netherlands.

9. Checking the site for viruses, malicious scripts and the presence of the site in blacklists

We checked the site for viruses so that Googlebot would not potentially ban the site for them. No problems were found; check result:


10. Check for presence and errors in the SSL certificate

The SSL certificate has an "A" rating, which is one of the best indicators for a website: https://www.ssllabs.com/ssltest/analyze.html?d=www.macys.com. Follow the recommendations at the link above to reach an A+ rating.

11. Broken backlinks

The site's backlinks were checked for broken targets: about 188 thousand broken links pointing to the site were found. We recommend restoring or redirecting pages that previously existed on the site in order to reclaim those links.
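Once the dead URLs that still receive backlinks are identified, 301 rules can be generated for them. The mapping below is hypothetical, and the output assumes nginx-style rewrite syntax:

```python
# Hypothetical map of removed paths to their closest live replacements.
broken = {
    "/shop/old-category": "/shop/new-category",
    "/product/12345": "/shop/womens-clothing",
}

def nginx_redirects(mapping):
    """Emit one permanent (301) nginx rewrite rule per dead path."""
    return "\n".join(
        f"rewrite ^{old}$ {new} permanent;" for old, new in mapping.items()
    )

print(nginx_redirects(broken))
```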

This is only 6% of the full technical audit for this case. To find out what a full audit includes and its price, click here.

Want to see our audits and find out the cost for you?

Book an appointment on a day and at a time convenient for you, and we will demonstrate our solution and pricing right away!