I've recently been working on SEO for our website, whose front end is built with Vue.js (not an SPA). From the posts I've read online, the general direction is prerendering or SSR.

SSR would undoubtedly solve the problem, but it feels heavyweight and the cost of refactoring is relatively high, so my research is currently focused on prerendering.

The diagram above shows my current general idea: periodically pre-render every page that needs SEO, and serve the rendered HTML directly to whoever visits, user or crawler. Many prerender solutions out there serve pre-rendered pages only to crawlers, but I see two advantages in serving them to users as well: 1. There is no risk that the content returned to the search engine differs from what users see and gets treated as cheating (Google's documentation states that content visible only to search engines but not to users is considered cloaking); 2. Pre-rendering for users effectively adds a caching layer, which greatly reduces load on the server (indirectly achieving what squid/Varnish would).

The obvious drawback of this scheme is that data updates are not reflected in real time. However, our site's content changes infrequently, and re-rendering of specific pages can be triggered manually whenever something sensitive changes.
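The periodic job plus manual trigger could be driven by a small scheduling helper along these lines (a sketch under my own assumptions; the field names `lastModified`/`lastRendered` and the function name are hypothetical):

```javascript
// Pick the pages whose content changed since their last render, plus any
// paths that were manually forced (e.g. after a sensitive edit).
function pagesNeedingRender(pages, forcedPaths = []) {
  const forced = new Set(forcedPaths);
  return pages
    .filter(p => forced.has(p.path) || p.lastModified > p.lastRendered)
    .map(p => p.path);
}
```

The cron job would call this with an empty `forcedPaths`, while the manual trigger passes the specific paths to refresh immediately.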

Although this scheme looks nice, there is a thorny problem: after pre-rendering, the page's JS interactions are gone! The raw HTML emitted by Webpack contains script tags referencing vendor.js, manifest.js, and the remaining bundles, but even when the pre-rendered page keeps these references, events are no longer triggered.
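For reference, the snapshots I'm experimenting with are produced roughly like this, using prerender-spa-plugin in the Webpack config (the routes below are placeholders, and the exact options may differ by plugin version). Since event listeners live only in JS and never appear in serialized HTML, they can only come back if the bundle runs and Vue successfully re-mounts on the prerendered markup:

```javascript
// webpack.config.js (fragment)
const path = require('path');
const PrerenderSPAPlugin = require('prerender-spa-plugin');

module.exports = {
  // ...existing entry/output/loaders...
  plugins: [
    new PrerenderSPAPlugin({
      // Directory containing the built index.html and bundles
      staticDir: path.join(__dirname, 'dist'),
      // Placeholder routes; in practice, the list of pages needing SEO
      routes: ['/', '/about'],
    }),
  ],
};
```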

Is there an effective way to solve this problem? Any advice would be appreciated ~