Here is the full list of what this hands-on project covers:

  • The WeRoBot framework to build a WeChat auto-reply bot
  • Tornado for the web backend
  • MongoDB for data storage
  • Scrapyd to deploy the crawler
  • Scrapy to write the crawler
  • Everything deployed on an Alibaba Cloud server
  • Python to glue it all together, so you can get the daily news through WeChat

Enough talk. Let's start with a screenshot so you can see the result!




The webpage is ugly, please forgive me: I don't specialize in front-end work, and I'll fill in that knowledge later. Why access it from a desktop browser? Because access from the phone may still run into some restrictions, while desktop access works with no problem at all. Pretty neat, right?

So why not follow "Pique's Pooper"? With solid material like this, won't you help me share and promote it, so more people can enjoy the fun of learning?

The development approach

Now that we see the results, let’s take a look at how this process works.

First of all, let's imagine: how can I reach the website through a WeChat Official Account?



Here's the answer: we need a WeChat auto-reply bot. That way I don't have to log in to the webpage every day and copy-paste material by hand; the computer does the work for me, which makes my life much easier.


This is where WeRoBot comes in.
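
To make this concrete, here is a minimal WeRoBot sketch; the token and the reply text are placeholders, and the real bot would plug the news lookup into the handler:

```python
import werobot

# The token must match the one configured in the WeChat Official Account
# admin console; 'my_token' is just a placeholder here.
robot = werobot.WeRoBot(token='my_token')

@robot.text
def reply(message):
    # Respond to a keyword; in the real project this would return
    # a link to the page with today's news.
    if message.content == 'Daily':
        return "Here is the link to today's news page."
    return 'Send "Daily" to get today\'s news.'

# Start WeRoBot's built-in development server.
robot.run()
```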

Next, once we have the auto-reply bot, what is there for it to show us?



Here's the answer: we need a remote server running a web service that serves up a page with the information we want.


This is where an Alibaba Cloud server (a Tencent Cloud one works too) and Tornado come in.
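
As a rough idea of the Tornado side, a tiny service like the one below would do; the /news route and port 8000 are assumptions for illustration:

```python
import tornado.ioloop
import tornado.web

class NewsHandler(tornado.web.RequestHandler):
    def get(self):
        # In the real project this would render a template filled with
        # the day's crawled news; a plain string keeps the sketch short.
        self.write("Today's news will show up here.")

def make_app():
    # "/news" is a route name made up for this sketch.
    return tornado.web.Application([(r"/news", NewsHandler)])

if __name__ == "__main__":
    make_app().listen(8000)
    tornado.ioloop.IOLoop.current().start()
```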

Then we have to figure out: where does the data on the web page come from?



Here's the answer: the data shown on the web page should be read from a database.


This is where MongoDB comes in.
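
For reference, writing and reading that data with pymongo might look roughly like this; the connection string, database and collection names are all made up for the sketch:

```python
import datetime
import pymongo

# Connection string and database/collection names are assumptions.
client = pymongo.MongoClient("mongodb://localhost:27017")
collection = client["daily_news"]["articles"]

# The crawler writes documents like this one...
collection.insert_one({
    "title": "Some headline",
    "url": "https://example.com/article",
    "date": datetime.date.today().isoformat(),
})

# ...and the web service reads back only the current day's records.
today = datetime.date.today().isoformat()
for doc in collection.find({"date": today}):
    print(doc["title"], doc["url"])
```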

We know where the data is read from, so how does it get into the database in the first place?



Here's the answer: the data lives on the source websites, and we can crawl it down.


This is where Scrapy comes in.
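
A bare-bones Scrapy spider gives the flavor; the spider name, start URL and CSS selectors are placeholders, and the real selectors depend on the news site being crawled:

```python
import scrapy

class NewsSpider(scrapy.Spider):
    # Spider name and start URL are assumptions for this sketch.
    name = "news"
    start_urls = ["https://example.com/news"]

    def parse(self, response):
        # Yield one item per article link found on the listing page.
        for item in response.css("div.article"):
            yield {
                "title": item.css("a::text").get(),
                "url": response.urljoin(item.css("a::attr(href)").get()),
            }
```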

That's pretty much it. With a bit of glue in the middle, we can deploy the crawler to the remote server with Scrapyd and run it on a timer, so we get a constant stream of fresh data, and the web page only needs to show the current day's data.
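
For the Scrapyd part, kicking off a run of the deployed spider is a single HTTP call; here is a minimal sketch, assuming Scrapyd's default port and made-up project and spider names:

```python
import requests

# Scrapyd listens on port 6800 by default; the project and spider
# names below are assumptions for this sketch.
resp = requests.post(
    "http://localhost:6800/schedule.json",
    data={"project": "daily_news", "spider": "news"},
)
print(resp.json())  # e.g. {"status": "ok", "jobid": "..."}
```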

Put it all together like that, and everything is perfect, except that someone still has to write all the code, right?

We chose Python for the whole thing, because it really is good at gluing these pieces together.

Now that the whole logic is clear, let's build it in reverse order.

A few previous posts:

A step-by-step guide to writing the crawler, and "Use Scrapyd to deploy crawlers step by step on Tencent Cloud".

Those posts cover how to write the crawler and how to deploy it to the server in great detail, with hands-on, step-by-step tutorials, so I won't repeat that here.

Next comes MongoDB, whose installation process I walked through in the previous article:

Install MongoDB on an Alibaba Cloud server and connect to it remotely with a visual client

If anything doesn't work, leave me a message and I'll help you sort it out.

That leaves WeRoBot and Tornado, which I'll cover in more detail in the next article. There are actually quite a few pitfalls here, but I've already stepped on them for you; just follow my steps one by one and you'll be fine.

Overall review

The crawler on the remote server is written with Scrapy.

Tornado's timed-execution feature runs it once every hour.
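
A minimal sketch of that hourly trigger using Tornado's PeriodicCallback; the crawl itself is stubbed out here, and in practice it would call Scrapyd as in the earlier sketch:

```python
import tornado.ioloop

def run_crawler():
    # In the real project this would hit Scrapyd's schedule.json
    # endpoint to kick off a crawl of the news site.
    print("starting the hourly crawl")

# PeriodicCallback takes the interval in milliseconds: one hour here.
tornado.ioloop.PeriodicCallback(run_crawler, 60 * 60 * 1000).start()
tornado.ioloop.IOLoop.current().start()
```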

The crawled data is stored in MongoDB.

Tornado's GET handler for the route reads the data from MongoDB and passes it into an HTML template that is already written.
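
Roughly, that handler could look like the following; the database and collection names and the news.html template are assumptions, and the handler would be registered in the Application from the earlier Tornado sketch:

```python
import datetime
import pymongo
import tornado.web

client = pymongo.MongoClient("mongodb://localhost:27017")  # assumed address
collection = client["daily_news"]["articles"]              # assumed names

class NewsHandler(tornado.web.RequestHandler):
    def get(self):
        # Pull only today's documents and hand them to the template.
        today = datetime.date.today().isoformat()
        items = list(collection.find({"date": today}))
        self.render("news.html", items=items)  # "news.html" is assumed
```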

WeRoBot is deployed on top of Tornado and embedded in the Tornado service. Specific commands sent through WeChat produce a result that gets returned to the client.
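
Assuming WeRoBot's Tornado adapter (werobot.contrib.tornado.make_handler), embedding the bot might look roughly like this, with a placeholder token, route and reply:

```python
import tornado.ioloop
import tornado.web
import werobot
from werobot.contrib.tornado import make_handler

robot = werobot.WeRoBot(token='my_token')  # placeholder token

@robot.text
def daily(message):
    if message.content == 'Daily':
        return "Here is today's news page."  # placeholder reply
    return 'Send "Daily" to get the news.'

# Mount the bot inside the same Tornado application that serves the
# news page; "/robot/" must match the URL configured on WeChat's side.
app = tornado.web.Application([(r"/robot/", make_handler(robot))])
app.listen(8000)
tornado.ioloop.IOLoop.current().start()
```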

That's all for now; I hope you enjoyed it. Fun shared is better than fun kept to yourself, so if you liked the article, please tell your friends and invite them to join us for more of it.

Follow "Pique's Pooper" and reply "Daily" to find what you really want.

With an official account so packed with solid content, why not follow it right away?