What obstacles cause indexing issues for a website?
The list of factors that block or penalize indexing is quite long. With improving development practices, the rising quality of popular CMSs, and the progress of search engines, it is becoming "harder" not to be indexed.
Some technologies, however, are outright disastrous if you want excellent indexing. Websites, or navigation paths within a website, designed entirely and exclusively in JavaScript also frequently cause interpretation problems for crawlers.
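To see the problem concretely, look at a page the way a non-rendering crawler does: fetch the raw HTML and count the <a href> links it actually contains. The sketch below uses only Python's standard library, and the URL is a placeholder for the page you want to audit; a JavaScript-only navigation will expose few or no links at this stage, even if the rendered page is full of them.

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class LinkCounter(HTMLParser):
    """Collects href values from <a> tags in raw, un-rendered HTML."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

url = "https://example.com/"  # placeholder: the page you want to audit
raw_html = urlopen(url).read().decode("utf-8", errors="replace")
parser = LinkCounter()
parser.feed(raw_html)
print(f"{len(parser.links)} crawlable link(s) found in the raw HTML")
```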
A site with no links pointing to it will take longer to be indexed, especially if it no longer receives any visitors.
It is also more complicated to index websites that use frames or iframes.
Massive duplicate content can also undermine the crawlers' confidence in a site. If most of the content they find already exists elsewhere, they will probably prefer to spend their resources (machine time) somewhere else.
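Search engines use far more sophisticated techniques at scale, but the idea behind duplicate detection can be sketched with word "shingles" and Jaccard similarity. The two page texts below are made-up examples:

```python
def shingles(text, size=5):
    """Return the set of overlapping word n-grams ("shingles") in a text."""
    words = text.lower().split()
    return {tuple(words[i:i + size]) for i in range(len(words) - size + 1)}

def jaccard(a, b):
    """Jaccard similarity between two sets: |intersection| / |union|."""
    return len(a & b) / len(a | b) if a | b else 0.0

page_a = "the quick brown fox jumps over the lazy dog near the river bank"
page_b = "the quick brown fox jumps over the lazy dog near the old bridge"
score = jaccard(shingles(page_a), shingles(page_b))
print(f"similarity: {score:.2f}")  # values close to 1.0 mean near-duplicates
```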
Truly exotic URLs are also more difficult to index, even though most engines are less restrictive than they used to be. Did you know that a few years ago, indexing a URL containing more than three hyphens on Microsoft's engine was complicated or even impossible?
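A small pre-publication check can flag URLs that look "exotic" before they go live. The thresholds in this sketch are illustrative assumptions, not official engine limits:

```python
from urllib.parse import urlparse, parse_qs

def url_warnings(url):
    """Flag URL traits that have historically made indexing harder."""
    parts = urlparse(url)
    warnings = []
    if parts.path.count("-") > 3:
        warnings.append("more than 3 hyphens in the path")
    if len(parse_qs(parts.query)) > 2:
        warnings.append("more than 2 query parameters")
    if len(url) > 100:
        warnings.append("URL longer than 100 characters")
    return warnings

print(url_warnings("https://example.com/a-very-long-keyword-stuffed-page-title?a=1&b=2&c=3"))
```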
Very deep pages with little internal linking give search engines little incentive to index them quickly, or at least to crawl them regularly.
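Click depth is easy to measure once you have your internal link graph: a breadth-first search from the homepage assigns each page its depth. The link graph below is a toy, hypothetical example; in practice you would build it from a crawl of your own site. Pages that only appear at depth four or more are typically the ones crawlers visit late and rarely.

```python
from collections import deque

# Hypothetical internal link graph: page -> pages it links to.
links = {
    "/": ["/blog", "/products"],
    "/blog": ["/blog/post-1"],
    "/products": ["/products/widget"],
    "/blog/post-1": ["/blog/post-1/comments"],
    "/blog/post-1/comments": [],
    "/products/widget": [],
}

# Breadth-first search from the homepage computes each page's click depth.
depth = {"/": 0}
queue = deque(["/"])
while queue:
    page = queue.popleft()
    for target in links.get(page, []):
        if target not in depth:
            depth[target] = depth[page] + 1
            queue.append(target)

for page, d in sorted(depth.items(), key=lambda item: (item[1], item[0])):
    print(f"depth {d}: {page}")
```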