1. Compress images and use CSS sprites. Images should be compressed as much as possible, and small icons should be combined into a single CSS sprite image to reduce the number of HTTP requests.
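A minimal sketch of the sprite technique: one combined image serves several icons, and each element shifts the background to reveal its region. The file name `icons.png` and the 16×16 icon sizes here are illustrative assumptions.

```html
<style>
  /* One combined image serves every icon, so the browser makes
     a single HTTP request instead of one request per icon. */
  .icon {
    width: 16px;
    height: 16px;
    background-image: url("icons.png"); /* hypothetical sprite sheet */
    background-repeat: no-repeat;
    display: inline-block;
  }
  /* Shift the background to reveal the wanted 16x16 region. */
  .icon-home   { background-position: 0 0; }
  .icon-search { background-position: 0 -16px; }
</style>

<span class="icon icon-home"></span>
<span class="icon icon-search"></span>
```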

2. Separate structure, presentation, and behavior. Another major cause of slow web pages is piling CSS and JS directly into the HTML, and it annoys me every time someone writes CSS and JS inline on a page. Loading them from external files greatly speeds up page loading: put CSS files in the head, and put JS files at the bottom of the body so scripts load without blocking the reading of the page.
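A minimal page skeleton illustrating this placement (the file names `style.css` and `app.js` are illustrative):

```html
<!DOCTYPE html>
<html>
<head>
  <!-- External stylesheet in the head so the page renders styled. -->
  <link rel="stylesheet" href="style.css">
</head>
<body>
  <p>Page content renders before any script is fetched.</p>
  <!-- Script at the bottom of the body so it does not block rendering. -->
  <script src="app.js"></script>
</body>
</html>
```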

3. Optimize the site's hierarchical structure. Add breadcrumbs to every inner page so that spiders do not get lost after entering it. If possible, also add a dedicated Sitemap page that lays out the site structure at a glance, which makes it easier for spiders to crawl the site's information.
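A breadcrumb trail can be plain linked text, for example (the paths and labels are illustrative):

```html
<!-- Breadcrumb: shows the page's position in the site hierarchy
     and gives spiders crawlable links back up the tree. -->
<nav class="breadcrumb">
  <a href="/">Home</a> &gt;
  <a href="/news/">News</a> &gt;
  <span>Current article title</span>
</nav>
```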

4. Concentrate site weight. Spiders assign a certain weight to each page, and that weight is distributed evenly across every a link on it. To concentrate site weight, add the rel="nofollow" attribute to links that do not need to be crawled; this tells spiders not to follow the target page, so the weight is distributed among the remaining links instead.
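For instance, a login link that spiders need not follow might look like this (the URL is illustrative):

```html
<!-- rel="nofollow" tells spiders not to follow this link,
     so page weight flows to the remaining links instead. -->
<a href="/login" rel="nofollow">Log in</a>
```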

5. Emphasize text with semantic tags. When bolding a keyword, the strong tag carries more semantic emphasis for search engines than the b tag.
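The two tags render identically in most browsers, but only strong conveys semantic emphasis:

```html
<!-- Semantic emphasis: signals importance to search engines. -->
<strong>keyword</strong>

<!-- Purely visual bold: no extra semantic weight. -->
<b>keyword</b>
```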

6. Use the title attribute of the a tag. Where it does not affect page functionality, try adding a title attribute to a tags; it gives spiders additional text to index.
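A sketch of the attribute in use (the link target and wording are illustrative):

```html
<!-- The title attribute gives spiders (and users, as a tooltip)
     extra descriptive text about the link target. -->
<a href="/products/" title="Browse the full product catalogue">Products</a>
```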

7. Use the alt attribute of images. Like the title attribute above, alt supplies relevant text that spiders can read, and it is displayed on the page if the image fails to load.
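For example (the file name and text are illustrative):

```html
<!-- alt text is indexed by spiders and shown if the image fails to load. -->
<img src="logo.png" alt="Example Corp company logo">
```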

8. Use heading tags. A page should have at most one h1 tag, placed on the most important title on the page. For example, on the home page the h1 tag can wrap the logo.
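A sketch of the home-page case described above (file name and text are illustrative):

```html
<!-- One h1 per page, on the most important title; on the
     home page this can be the logo itself. -->
<h1>
  <a href="/"><img src="logo.png" alt="Example Corp"></a>
</h1>
```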

9. Declare image dimensions. If an image's width and height are not declared, the browser has to re-render the page once the image loads, which hurts loading speed.

10. Adjust the page layout. Avoid Flash and video in page content wherever possible, since spiders cannot crawl them; if they are truly necessary, generate corresponding static pages with crawlable text.
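The image-size declaration recommended above can be written with plain width and height attributes (the file name and sizes are illustrative):

```html
<!-- Declared width/height let the browser reserve the image's
     space immediately, avoiding a reflow when the file arrives. -->
<img src="banner.jpg" alt="Site banner" width="728" height="90">
```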

11. Keep the site structure a flat tree; the directory structure should not be too deep. Every page should be reachable from the home page in no more than three clicks, since pages buried deeper are harder for search engines to crawl.