Google’s John Mueller answered a question about why Google indexes pages that are disallowed from crawling by robots.txt and why it’s safe to ignore the related Search Console reports about those ...