eZ Community » Forums » General » SEO on multiple site installation

SEO on multiple site installation

Monday 11 November 2013 11:20:27 am - 4 replies

We have 11 sites on our eZ Publish 4.7 installation, and this is causing problems with Google indexing.

Google Webmaster Tools for site-1 (our biggest site) is showing us over 1000 "page not found" crawl errors. For some reason Google is indexing site-2, site-3, site-4, (...) pages as site-1 pages, and the other way around.

I found a forum topic on this problem. We have always had separate anonymous users for each site, as suggested in the forum reply, but this is not making any difference.

Besides the indexing problem, Google is also reporting a lot of duplicated titles and meta descriptions.

Our index count is decreasing more and more each day, falling from 850 000 to 435 000 pages on site-1 since June.

Any suggestions on this topic, or thoughts on what is causing this to happen?

Modified on Monday 11 November 2013 11:22:52 am by Jeanette S.

Tuesday 12 November 2013 4:17:46 am

Can you provide the actual site URL so that the HTML can be interrogated?  

Basically, Google will only follow links you provide, and cross-reference those with any meta-tag info you provide, via the rel="canonical" link tag and the rel="alternate" hreflang="" tag, as well as any robots.txt info.

Google dislikes duplicate content, so if you do have duplicate content (e.g with multiple locations of content), you should use the canonical link tag to point Google to the main location.
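For illustration, a canonical link plus an hreflang alternate in the page head might look like this (the hostnames and paths are placeholders, not the poster's actual sites):

```html
<!-- On every copy of the article, point Google at the one "main" location -->
<link rel="canonical" href="http://site1.example.com/articles/my-article" />
<!-- If another site carries a language variant of the same page, declare it -->
<link rel="alternate" hreflang="nb" href="http://site2.example.com/artikler/min-artikkel" />
```

With the canonical tag in place, duplicates on the other sites are consolidated onto the main URL instead of competing with it.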

A drop in indexed pages isn't necessarily bad - it could simply indicate Google is pruning where it sees duplication.

About rel="canonical"

Tuesday 12 November 2013 9:06:29 am

Hi! Thanks for coming back to me.

Yes, but Google must be misunderstanding our structure, since URLs from site-2 are mixed in with site-1 and so on. Is there somewhere in the .htaccess or robots.txt file where I can try to "explain" this to Google?

We have not provided Google with sitemaps. Could sitemaps make this better? I think we need separate sitemaps per site, with one sitemap index referencing them. Is there an extension that could do that for us? I don't think bcgooglesitemaps will do.
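For reference, a sitemap index is a small XML file that points to one sitemap per site; a minimal sketch with placeholder URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://site1.example.com/sitemap.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://site2.example.com/sitemap.xml</loc>
  </sitemap>
</sitemapindex>
```

One caveat: the sitemaps protocol expects a sitemap to cover URLs on its own host, so in practice each site usually needs its own sitemap submitted and verified separately in Webmaster Tools rather than one index spanning all domains.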

I think you are right about the canonical link tag, so I will look further into that.

Tuesday 12 November 2013 9:50:56 am

In the crawling errors in Webmaster Tools, you can usually see where the links come from - are you able to access that info, and see if your sites are generating these links?

Tuesday 12 November 2013 11:32:20 am

Yes, I am.. Thanks for making me go deeper into that.

I think we have several problems, and they are our own mistakes.

1. We have a lot of objects with multiple locations. Sometimes an author from site-1 "steals" an article from site-3 if they think it's relevant for their site. At the bottom of each article we have an "In Category" section with a link to the article's parent. If the article has multiple locations it will display the categories from all sites, so we get site1-cat :: site2-cat :: site3-cat, and the links point across sites. This of course causes crawl errors.
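One possible fix for the "In Category" section is to render only the parent category whose location lies under the current site's content root. A rough, untested eZ Publish 4.x template sketch; $article is assumed to be the article object, and the site root is read from content.ini:

```
{* Sketch: link only the parent category under this siteaccess' root node *}
{def $site_root = ezini( 'NodeSettings', 'RootNode', 'content.ini' )}
{foreach $article.assigned_nodes as $location}
    {* path_array holds the ancestor node IDs of this location *}
    {if $location.path_array|contains( $site_root )}
        <a href={$location.parent.url_alias|ezurl}>{$location.parent.name|wash}</a>
    {/if}
{/foreach}
```

The idea is simply to filter assigned_nodes by the current siteaccess' subtree so cross-site category links are never emitted in the first place.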

2. We use tags to display related articles. We had not noticed that tags from other sites in our system are also displayed there, giving us URLs like site1domain/site2domain/category/article.

3. Google also crawls the /layout/set/print URLs, giving us even more errors. We need to fix our robots.txt.
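The print views can be excluded with a robots.txt rule; a sketch assuming the default eZ Publish print URL pattern:

```
User-agent: *
Disallow: /layout/set/print/
```

Each domain serves its own robots.txt, so this rule needs to be in place for every site in the installation.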


But we also need to fix the Google indexing problem and the duplications. I think this has to be done in .htaccess or something, because I can't find a way to do this by dynamically adding canonical links through $module_result or fetch( 'content', 'node', hash( 'node_id', module_params()[parameters][NodeID] ) ).
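It may in fact be doable in the pagelayout rather than .htaccess. A rough, untested eZ Publish 4.x sketch, assuming $module_result.node_id is set on content views and that the object's main node is the location you want Google to treat as canonical:

```
{* Sketch for pagelayout.tpl: emit a canonical link pointing at the object's main node *}
{if and( is_set( $module_result.node_id ), $module_result.node_id )}
    {def $current_node = fetch( 'content', 'node', hash( 'node_id', $module_result.node_id ) )}
    {if $current_node}
        <link rel="canonical" href={$current_node.object.main_node.url_alias|ezurl( 'double', 'full' )} />
    {/if}
{/if}
```

Since multi-location articles all share one main node, every secondary location would then declare the main location as canonical, which is exactly the consolidation Google asks for.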

