Read this whole post if you want to understand.
One day you make a website, and some time later Google crawls it.
Now the search engine knows your site exists.
The next day it crawls again and finds new content. The day after, it crawls again and again finds new content, and so on.
This keeps the bot crawling your website on a regular basis: it hopes to find new content, and it keeps finding it. As long as you are uploading content regularly, the bot is encouraged to crawl your website, because it needs fresh content that it can judge and present to users for the best results.
Conversely, if you do not upload new content, crawling will drop off.
The crawler just crawls, and if it finds new content, it sends it for indexing. That means you do not need to upload brand-new content every day: even when you only update existing content, a signal (whose name I cannot remember right now) is received by the crawler, and it then prioritises crawling your website over others. That signal (the protocol) makes the crawler believe new content has been published, whereas the content was only updated.
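As an aside, one likely candidate for that unnamed signal is the `<lastmod>` field in an XML sitemap (this is my guess, not something the post confirms). When you update a post and bump its `<lastmod>` date, crawlers that read the sitemap can see the page changed, even though no new URL was added. A minimal sketch, using a placeholder domain:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- example.com/my-post/ is a hypothetical URL -->
    <loc>https://example.com/my-post/</loc>
    <!-- bumping this date tells crawlers the page was changed -->
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>
```

The HTTP `Last-Modified` response header plays a similar role for crawlers fetching the page directly.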
(The main thing is that changes have been made.)
I am saying all this.....
because backlinks get indexed when the crawler indexes the site they are on. The more frequent the updates and the clearer the path, the faster the content, i.e. the link, gets crawled.
Request indexing for that post in Search Console.... Let's see what happens...