What are the problems with session IDs in URLs?
Sessions are very useful, even essential, in many cases (advertising tracking…). However, when their IDs are passed in URLs, they cause a serious problem.
Since a new session is started for each visit, a parameter containing the session ID is appended to every URL, even though the content of the website doesn't change.
As a result, you can end up with thousands of different URLs for the same content. On each visit, search engines discover new URLs, each one yet another duplicate of the same page.
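To see why these URLs all collapse to one page, consider stripping the session parameter: what remains is identical. Here is a minimal sketch of such a canonicalization; the parameter names in `SESSION_PARAMS` are illustrative assumptions, since each platform uses its own (PHP uses `PHPSESSID`, Java servlet containers use `jsessionid`, and so on).

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Illustrative session-parameter names; real sites may use others.
SESSION_PARAMS = {"sid", "sessionid", "phpsessid", "jsessionid"}

def canonicalize(url: str) -> str:
    """Strip session-ID parameters so duplicate URLs collapse to one."""
    parts = urlparse(url)
    query = [(k, v) for k, v in parse_qsl(parts.query)
             if k.lower() not in SESSION_PARAMS]
    return urlunparse(parts._replace(query=urlencode(query)))

# Two visits, two session IDs, but the same underlying content:
a = canonicalize("https://example.com/product?id=42&sid=abc123")
b = canonicalize("https://example.com/product?id=42&sid=xyz789")
assert a == b == "https://example.com/product?id=42"
```

This is essentially the cleanup work a search engine must do on your behalf; the solutions below avoid creating the duplicates in the first place.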
Search engines don't want to fill their servers with identical data (the famous “duplicate content”): more and more of them clean up such duplicates and spend less and less time crawling the architecture of your website if it is not “Google-friendly”.
One of the golden rules to remember is the following:
One content = one (indexed) URL
There are several solutions to this problem:
- Don't use the URL to store the session ID; use cookies (which robots don't send) instead.
- Don't create a session for search engines.
- Open a session only when the visitor logs in (to their private, non-SEO area), not on the first page view.
- Change your website's URL behavior when a robot is detected (cloaking).
- On Microsoft IIS servers, the problem can sometimes be fixed directly in the server configuration. Be careful, however: test beforehand.