For the last two months, Google has been playing a sick game with me. First they removed two of my pages, which ranked #1 last month, claiming they were blocked by robots.txt, when there was no such issue at all. Somehow, this month I managed to create new pages and rank them at #2 and #3, but today they removed my pages again, and this time I checked everything and there is no issue at all. Instead, they replaced the #2 page with another page of mine. I mean, what is Google doing…
Monitor website uptime. If robots.txt returns 200, a 30X redirect, or 404, it's okay, but if it returns a 5XX or no response due to downtime, your site can get de-indexed.
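Here is a minimal sketch of such a check in Python, assuming the common `requests` library; the domain is a placeholder you would swap for your own site:

```python
# Minimal robots.txt availability check (sketch, assumes the "requests" library).
import requests

SITE = "https://example.com"  # placeholder: replace with your own domain

def check_robots(site: str) -> None:
    url = f"{site}/robots.txt"
    try:
        resp = requests.get(url, timeout=10, allow_redirects=False)
    except requests.RequestException as exc:
        # No response at all: crawlers may treat the whole site as unreachable.
        print(f"FAIL: no response from {url} ({exc})")
        return
    code = resp.status_code
    if code == 200 or 300 <= code < 400 or code == 404:
        print(f"OK: {url} returned {code}")
    elif 500 <= code < 600:
        print(f"FAIL: {url} returned {code} - crawling may be suspended")
    else:
        print(f"WARN: {url} returned unexpected status {code}")

if __name__ == "__main__":
    check_robots(SITE)
```

You could run something like this from a cron job every few minutes so a 5XX during downtime doesn't go unnoticed.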
Avoid targeting similar keywords with two pages on a single domain. This kind of overlap can look spammy and create problems.
A technical audit of your server logs and on-page SEO can also help.
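For the log side, here is a rough sketch of what such an audit could look like, assuming a standard combined-format access log; the file path and regex are assumptions you would adjust for your server:

```python
# Sketch: count the status codes Googlebot received when fetching robots.txt,
# assuming a combined-format access log (hypothetical path, adjust as needed).
import re
from collections import Counter

LOG_PATH = "/var/log/nginx/access.log"  # hypothetical path

# Combined format: ip - - [time] "METHOD path HTTP/x" status size "referer" "ua"
LINE_RE = re.compile(r'"(?:GET|HEAD) (\S+) [^"]*" (\d{3}) .*"([^"]*)"$')

def audit(path: str) -> None:
    statuses = Counter()
    with open(path, encoding="utf-8", errors="replace") as log:
        for line in log:
            match = LINE_RE.search(line)
            if not match:
                continue
            url, status, agent = match.groups()
            # Only count Googlebot requests to robots.txt.
            if "Googlebot" in agent and url.startswith("/robots.txt"):
                statuses[status] += 1
    for status, count in statuses.most_common():
        print(f"{status}: {count} Googlebot fetches of robots.txt")

if __name__ == "__main__":
    audit(LOG_PATH)
```

If this shows 5XX responses on days when pages dropped out, that would point to the uptime issue above rather than anything wrong with the pages themselves.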
I mean, everything is perfect. They first remove the page, then add it back two days later, and then remove it again.