Every SEO practitioner should master the basic workflow of SEO: positioning the site, analyzing demand to choose good content, on-site optimization, off-site optimization, promotion, data analysis, and experience analysis.

SEO: Search Engine Optimization

SEO is short for Search Engine Optimization. It refers to adjusting a site through on-site optimization (site structure, site construction, page code optimization) and off-site optimization (off-site promotion, brand building, and so on) so that it satisfies the search engines' ranking requirements and improves its keyword rankings, thereby bringing precisely targeted users to the site, gaining free traffic, and generating direct sales and brand exposure.

White Hat SEO

White hat SEO is the fair approach: it uses optimization methods that comply with the guidelines of the mainstream search engines. It is the opposite of black hat SEO. White hat SEO has long been regarded as the industry's best practice: it operates while avoiding all risks and any conflict with the search engines' published policies, and it represents the highest professional standard for SEO practitioners.

Black Hat SEO

Generally speaking, anything that relies on cheating or suspicious means can be called black hat SEO, for example spam links, hidden pages, doorway (bridge) pages, and keyword stuffing. Its defining feature is short-term, fast profit. Because these tricks exploit loopholes in the search engines, they can be punished at any time when the search algorithms change.

Grey Hat SEO

Grey hat SEO refers to the middle ground between white hat and black hat. Compared with white hat, it uses some tricks in its operations; these behaviors are not outright violations, but they do not follow the rules either, which is why they fall into a grey area.

SEM: Search Engine Marketing

SEM is short for Search Engine Marketing. Simply put, search engine marketing is network marketing based on the search engine platform: it takes advantage of people's reliance on and habits around search engines to deliver a company's information to target users at the moment they search. Users find the company on their own, click its advertisement, and finally contact the company or place an order.

Pay-per-click advertising (PPC: Pay Per Click)

PPC is an acronym for Pay Per Click. Pay-per-click advertising is the most common form of online advertising used by large companies. Many websites offer pay-per-click, mainly the major portal sites (such as Sohu and Sina), the search engines (such as Google and Baidu), and other high-traffic websites.

Baidu Bid

Baidu bidding (paid ranking) is a pay-for-performance promotion method pioneered by Baidu in China. Simple, convenient web-based operation can bring a large number of potential customers to a business and effectively increase its visibility and sales. Every day, more than 100 million people search for information on Baidu; after a company registers keywords related to its products on Baidu, it can be found by the potential customers who actively search for those products.

DSP: Demand-Side Platform

In the Internet advertising industry, a DSP is both a system and an online advertising platform. It serves advertisers and helps them run campaigns on the Internet or the mobile Internet. Through a DSP, advertisers can, in a simpler and more unified way of bidding and receiving feedback, buy high-quality advertising inventory in real time and at a reasonable price across multiple online ad exchanges.

* The demand side (DSP) raises the advertising demand; the trading platform (Ad Exchange) connects to the supply-side platform (SSP); the precise user is found through the data management platform (DMP); and the price the demand side pays for the ad impression is determined through real-time bidding (RTB).
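
To make the roles above concrete, here is a minimal, hypothetical sketch of a single RTB impression in Python. The class and function names (Impression, dsp_bid, run_auction) and the DMP lookup table are illustrative assumptions, not any real platform's API.

```python
from dataclasses import dataclass
from typing import Optional, Set, List

@dataclass
class Impression:
    user_id: str
    site: str
    floor_price: float      # minimum CPM the publisher (SSP) will accept

@dataclass
class Bid:
    dsp_name: str
    cpm: float              # price offered per 1,000 impressions

# Hypothetical DMP lookup: maps a user to audience segments used for targeting.
DMP_SEGMENTS = {"user-42": {"sports", "sneakers"}}

def dsp_bid(name: str, targets: Set[str], base_cpm: float, imp: Impression) -> Optional[Bid]:
    """A DSP bids only if the DMP says the user matches its target audience."""
    segments = DMP_SEGMENTS.get(imp.user_id, set())
    return Bid(name, base_cpm) if segments & targets else None

def run_auction(imp: Impression, bids: List[Optional[Bid]]) -> Optional[Bid]:
    """The Ad Exchange keeps bids above the SSP's floor price and applies a
    simple second-price rule: the winner pays the runner-up's price."""
    valid = sorted((b for b in bids if b and b.cpm >= imp.floor_price),
                   key=lambda b: b.cpm, reverse=True)
    if not valid:
        return None
    winner = valid[0]
    winner.cpm = valid[1].cpm if len(valid) > 1 else imp.floor_price
    return winner

imp = Impression(user_id="user-42", site="news.example", floor_price=1.0)
bids = [dsp_bid("DSP-A", {"sneakers"}, 3.5, imp),
        dsp_bid("DSP-B", {"travel"}, 4.0, imp)]
print(run_auction(imp, bids))   # DSP-A is the only matching bidder, so it pays the 1.0 floor
```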

Ad Exchange

Ad Exchange is an Internet advertising trading platform. Like a stock exchange, it connects the buyers and sellers of advertising, that is, advertisers and ad-space owners.

Supply-Side Platform (SSP)

Supply-side platforms allow media owners to take part in ad transactions by making their ad inventory available. Through this platform, media owners hope to obtain the highest effective cost per thousand impressions for their inventory without having to sell it at a low price.

Data Management Platform (DMP)

A data management platform helps all parties involved in buying and selling ad inventory manage their data, makes it easier to use third-party data, deepens their understanding of all this data, sends data back, or passes custom data into a platform for better targeting.

Real-Time Bidding (RTB)

RTB is a technology-driven precision marketing method. When a user browses a certain product or clicks on a particular type of advertisement anywhere on the web, the browsing trail is recorded via cookies. Through the ad exchange, the next time the user browses the web they are served advertisements that match their preferences.

ASO: App Store Optimization

ASO is short for App Store Optimization: the process of improving an app's ranking in the various app marketplaces and their search results. It is the mobile-app equivalent of SEO.

LBS: Location-Based Service

Generally speaking, a location-based service first determines the geographical location of a mobile device or user, and then provides all kinds of location-related information services, for example LBS combined with O2O (as with Meituan). Users query nearby business information through the LBS service and then follow the navigation to the physical store; the query is the online part, and completing the purchase in the physical store is the offline part.

CPC: Cost Per Click

Cost per click is the most common pricing model in online advertising: the advertiser pays a set amount for each click on the ad.

CPS: Cost Per Sale

The advertising cost is calculated from the actual number of products sold. This model is better suited to shopping, shopping-guide, and site-navigation websites, which need precisely targeted traffic that converts.

CPT: Cost Per Time

This model charges according to the user's usage time or usage period, which fundamentally eliminates fake-traffic ("traffic brushing") and fake-activation cheating. It is one of the most genuine and effective marketing methods.

CPA: Cost Per Action

This pricing model is tied to the actual effect of the advertising: the advertiser is charged for pre-set conversion goals that are actually achieved, rather than for an unlimited amount of ad delivery.

CPM: Cost Per Mille

Cost per thousand impressions: the price an advertiser pays for 1,000 displays of their ad.

CPR: Cost Per Response

Charging for each response from the viewer fully embodies the characteristics of online advertising: timely response, direct interaction, and accurate recording. However, it is essentially just a pricing formula that assists sales.
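
As a rough illustration of what each of these pricing models actually bills for, here is a small Python sketch; the figures are made up and the function names are ours, not any ad platform's API.

```python
def cpc_cost(clicks: int, price_per_click: float) -> float:
    """CPC: pay for each click on the ad."""
    return clicks * price_per_click

def cpm_cost(impressions: int, price_per_mille: float) -> float:
    """CPM: pay for every 1,000 times the ad is displayed."""
    return impressions / 1000 * price_per_mille

def cpa_cost(conversions: int, price_per_action: float) -> float:
    """CPA: pay only for pre-set conversion actions (sign-ups, orders, ...)."""
    return conversions * price_per_action

def cps_cost(units_sold: int, commission_per_sale: float) -> float:
    """CPS: pay a commission for each product actually sold."""
    return units_sold * commission_per_sale

# One hypothetical campaign: 100,000 impressions, 2,000 clicks, 50 conversions, 30 sales.
print(cpm_cost(100_000, 5.0))   # 500.0 -> billed by exposure
print(cpc_cost(2_000, 0.40))    # 800.0 -> billed by clicks
print(cpa_cost(50, 12.0))       # 600.0 -> billed by conversions
print(cps_cost(30, 25.0))       # 750.0 -> billed by sales
```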

Average Revenue Per User (ARPU)

ARPU measures how much revenue an operator earns per user over a given period. Obviously, the more high-end users there are, the higher the ARPU. From the operator's point of view, a high ARPU over the period means high revenue and good returns for that period.

Daily Active Users (DAU) is usually measured as the number of users who log in to or use a product each day (with each user counted only once, no matter how many times they log in), and is an important indicator of user engagement.

Return On Investment (ROI)

Return on investment (ROI) = annual profit (or average annual profit) / total investment × 100%. It is usually used to evaluate how much value an activity brings to the business; a high ROI indicates a high-value project.
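
The ARPU and ROI definitions above are simple arithmetic; the following sketch merely restates them in Python with illustrative numbers.

```python
def arpu(total_revenue: float, active_users: int) -> float:
    """ARPU: average revenue earned per user over the period."""
    return total_revenue / active_users

def roi(annual_profit: float, total_investment: float) -> float:
    """ROI = annual profit / total investment x 100%."""
    return annual_profit / total_investment * 100

print(arpu(1_200_000, 40_000))   # 30.0 -> each user brought in 30 of revenue
print(roi(150_000, 500_000))     # 30.0 -> the project returned 30% for the year
```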

Full-Text Search Engine

Full-text search engines are the mainstream search engines in wide use today. They work as follows: an indexing program scans every word in a document and builds an index entry for each word, recording how often and where that word appears in the document; when a user issues a query, the retrieval program searches against this pre-built index and returns the results to the user. The most commonly used full-text search engines are Baidu and Google. The corresponding alternative type is the directory (index) search engine.
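
The per-word index described above (each word mapped to where and how often it appears) is essentially an inverted index. A minimal sketch of the idea, not any search engine's actual implementation:

```python
from collections import defaultdict

def build_inverted_index(docs: dict) -> dict:
    """Map each word to {doc_id: [positions]}: where and how often it occurs."""
    index = defaultdict(lambda: defaultdict(list))
    for doc_id, text in docs.items():
        for pos, word in enumerate(text.lower().split()):
            index[word][doc_id].append(pos)
    return index

def search(index: dict, query: str) -> list:
    """Return ids of documents containing every query word, most occurrences first."""
    words = query.lower().split()
    candidates = set(index.get(words[0], {}))
    for w in words[1:]:
        candidates &= set(index.get(w, {}))
    return sorted(candidates,
                  key=lambda d: sum(len(index[w][d]) for w in words),
                  reverse=True)

docs = {1: "seo is search engine optimization",
        2: "search engine marketing and search ads"}
index = build_inverted_index(docs)
print(search(index, "search engine"))   # [2, 1] -> doc 2 mentions "search" twice
```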

Metasearch Engine

A metasearch engine (aggregated search) helps users select and use the appropriate search engines (or even several search engines at once) through a unified user interface; it is a global control layer over the various search tools distributed across the network. China's first metasearch engine was Bibi Cat, which is now defunct. Some time ago, 360 offered a "comprehensive search" that was essentially metasearch, displaying the combined results of Baidu, 360, Google, and others on a single page, but it was discontinued, probably for copyright reasons.

Directory Search Engine

Although directory search engines have a search function, they are not search engines in the strict sense; they are simply lists of website links organized by category. Users can find the information they need entirely by browsing the category directory. Because human editors are involved, this kind of search engine offers accurate information and high-quality navigation, but its drawbacks are the need for manual intervention, heavy maintenance, limited coverage, and slow updates. Early Yahoo and Sohu were examples. Today's hao123 has a similar function, but it is not a directory search engine in the strict sense either; it is just a list of website links organized by category.

Web Spiders

Web spider (also known as web crawler, web robot, search engine spider) is a program or script that automatically crawls information on the World Wide Web according to certain rules.
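
In its simplest form a spider just fetches a page, extracts the links, and follows them in turn. Below is a minimal, hypothetical sketch using only the Python standard library; a real crawler would also respect robots.txt, de-duplicate more carefully, and throttle per host.

```python
import re
import time
import urllib.request
from urllib.parse import urljoin

def crawl(start_url: str, max_pages: int = 10, delay: float = 1.0) -> list:
    """Breadth-first crawl: fetch a page, collect its links, then follow them."""
    queue, seen, fetched = [start_url], {start_url}, []
    while queue and len(fetched) < max_pages:
        url = queue.pop(0)
        try:
            html = urllib.request.urlopen(url, timeout=5).read().decode("utf-8", "ignore")
        except Exception:
            continue                      # skip pages that fail to load
        fetched.append(url)
        for href in re.findall(r'href="([^"#]+)"', html):
            link = urljoin(url, href)     # resolve relative links against the page URL
            if link.startswith("http") and link not in seen:
                seen.add(link)
                queue.append(link)
        time.sleep(delay)                 # crude politeness: limits the crawl frequency
    return fetched

# print(crawl("https://example.com"))    # example.com is just a placeholder start URL
```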

Spider traps

A "spider trap" is an obstacle that prevents the spider from crawling a website, often caused by the technologies and methods used to display web pages. Many modern browsers take these factors into account when rendering, so the page may look perfectly normal to a visitor yet still throw the spider into confusion. Eliminating spider traps allows the spider to include more of the site's pages.

Crawl frequency

Crawl frequency is the total number of requests the search engine makes to the website's server in a unit of time (per day). If a site's crawl frequency is too high, it is likely to destabilize the server. Baiduspider automatically adjusts the crawl frequency according to factors such as the site's content update frequency and server load.

Query. A search request, also known as a search query, is the process by which a user types a keyword into a search engine in order to obtain results. The search request represents the searcher's intent.

Index

Commonly known as "pre-processing". The pages fetched by spiders are broken down, analyzed, and stored in the database as huge tables; this process is called indexing. The index database records the page's text content and, for each keyword, its position, font, color, bold, italics, and other relevant information.

Site index

The site index (index volume) is the number of a site's pages that are available as search candidates.

A site's content pages must be crawled and screened by the search engine before they can be shown to users in search results; indexing is the process by which pages pass this screening and become search candidates.

At present, the figure returned by the site: syntax is only an estimate of the index volume and is not accurate; Baidu recommends that webmasters use its official index tools instead and says it is working on improving the site: syntax.

Inclusion

Inclusion means that a search engine has indexed a site's pages into its own database. Common forms of inclusion are Baidu inclusion, Google inclusion, Sogou inclusion, Youdao inclusion, Yahoo inclusion, Kua Search inclusion, and Ze Xu inclusion. Users can submit a site through a search engine's submission entrance to attract spiders to fetch its pages, or attract the search engine to visit the pages via external links; when the search engine judges that a page meets its inclusion criteria, it includes the page.

Building the index database means creating an index for a newly included web page. We usually distinguish two cases when checking whether a page has been included: the first is searching for the page's link; the second is searching directly for the page's title. When the search engine has included only the page's link, and the page cannot be found by searching its title, we say the page "has not been built into the database": the search engine has found your URL but has not placed the page in the index. In that case, no query related to the page will return it in the search results, except a search for the URL itself.

Invalid inclusion. On invalid inclusion, Baidu webmaster @hanbelt put it this way: start from the literal meaning; "invalid" means useless, bringing no search traffic at all. Take, for example, content on "what to do when a child has a fever": Baidu already has about 500,000 such pages in its index, and the topic is unlikely to produce fresh content in the short term (barring some sudden scientific breakthrough). Baidu judges that this many indexed pages is enough to meet users' needs and that indexing more would be a waste of resources, so newly included pages on the topic are not placed in the index library but in the bottom-level library.

Bottom-level library. What is the bottom-level library? It is the backup bench! Your goddess has 500,000 boyfriends and you are backup number 500,000; how do you ever get your turn? Perhaps when most of the 500,000 ahead of you are gone. Pages that cannot take part in the rankings sit in the bottom-level library. So, if you do not want your content dropped into the bottom-level library, you either need enough influence that Baidu dare not ignore you (like Sina or NetEase, whose pages Baidu obediently indexes even when they are advertisements, treating them as the official boyfriend rather than a spare), or you need fresh content, because Baidu also prefers the new over the old and will not pass up good material. * What if all your included pages end up in the bottom-level library? How can this be improved?

Ranking

Enter a keyword into a search engine and you usually get a large number of results; the order in which these results are arranged is the search engine ranking.

Natural ranking

A natural (organic) ranking is one that appears on the search engine results page purely according to the relevance and importance of the page itself. On the results page, advertising or paid rankings usually carry labels such as "promotion" or "sponsored links"; natural rankings carry no such marks.

Web snapshot. When a search engine includes a web page, it keeps a backup copy of the page in its own server cache. When a user clicks the "web snapshot" link in the search results, the search engine displays the page content that its spider system captured and saved at crawl time; this is the "web snapshot". * What does a Baidu snapshot update mean?

Site Weight

Site weight refers to the authority value that a search engine assigns to a website (including its pages); it is an evaluation of the site's (and its pages') authority. The higher a site's weight, the more the search engine favors it and the better it ranks. A few points to note: 1. weight is not the same thing as ranking; 2. weight has a very large influence on ranking; 3. the weight of the whole site helps its inner pages rank.

* A common misunderstanding: the "Baidu weight" people often discuss is not an official Baidu concept. It is a third-party popularity score (graded 0-10) published by sites such as Aizhan and Chinaz webmaster tools, estimated from the traffic that a site's keyword rankings are expected to bring.

Weight Reduction (Demotion)

Exploiting flaws in the search engine's strategy or using malicious means to obtain rankings that do not match a page's real quality, and thereby harming the search results and the user experience, is treated by the search engine as cheating. Cheating is punished in proportion to the offence: if it has little impact on user experience and search result quality, only the weight gained by the cheating is removed; if the impact is serious, the weight gained by cheating is removed and the site's own weight is reduced, up to removing the site from the search results entirely.

Being "K'd" (K station) is commonly known as being "plucked clean". A Baidu K station means Baidu has blocked your website: your included pages drop to zero, and no trace of the site can be found anywhere in Baidu's search results. Being K'd essentially means being frozen out by Baidu, and as a rule it takes a long time to recover.

Alexa ranking

Alexa ranking refers to a site's world ranking, divided mainly into an overall ranking and category rankings. Alexa provides the overall ranking, visit ranking, page-view ranking, and other evaluation metrics, and most people regard it as one of the more authoritative indicators of site traffic currently available.

Site PR (PageRank)

PageRank is a search-engine ranking of web pages based on the links pointing between them (graded from 0 to 10). It is named after Larry Page, a founder of Google.

* In 2014, Google officially announced that it was abandoning PR.
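
PageRank itself is a straightforward iterative computation over the link graph. A textbook sketch with the usual 0.85 damping factor (an illustration of the idea, not Google's production system):

```python
def pagerank(links: dict, damping: float = 0.85, iterations: int = 50) -> dict:
    """links maps each page to the list of pages it links out to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with equal rank everywhere
    for _ in range(iterations):
        new_rank = {}
        for p in pages:
            # a page's rank is the damped sum of the shares passed by pages linking to it
            incoming = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
            new_rank[p] = (1 - damping) / n + damping * incoming
        rank = new_rank
    return rank

graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
print(pagerank(graph))   # C receives rank from both A and B, so it scores highest
```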

robots.txt

The Robots Protocol (also known as the crawler protocol, robot protocol, etc.), formally the "Robots Exclusion Protocol", is the mechanism through which a website tells search engines which pages may and may not be crawled.
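
For illustration, here is what a typical robots.txt looks like and how a crawler can check it with Python's standard-library parser; the paths and user agent below are made-up examples.

```python
import urllib.robotparser

# A made-up robots.txt: allow everything except /admin/, and advertise the sitemap.
robots_txt = """\
User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://www.example.com/sitemap.xml
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())    # a real crawler would fetch /robots.txt first

print(parser.can_fetch("Baiduspider", "https://www.example.com/article/1"))   # True
print(parser.can_fetch("Baiduspider", "https://www.example.com/admin/login")) # False
```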

A sitemap (site map) comes in two common formats: XML and HTML. The HTML version is usually a static page shown to users to help them find the content they need on the site. The XML version is mostly submitted to search engines so that they can crawl the site's pages.
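
For reference, a minimal XML sitemap can be generated with the standard library. The URLs below are placeholders, and the tags follow the common sitemaps.org format.

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls: list) -> str:
    """Build a minimal XML sitemap (sitemaps.org format) for (url, lastmod) pairs."""
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc, lastmod in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = loc
        ET.SubElement(entry, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

pages = [("https://www.example.com/", "2023-01-01"),
         ("https://www.example.com/about", "2023-01-05")]
print(build_sitemap(pages))   # save the output as sitemap.xml and submit it to search engines
```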

The Sandbox effect

Newly launched websites tend to rank poorly on the results pages of Google, the dominant search engine; this phenomenon is known as the sandbox effect. Sites whose link popularity grows too quickly are given the cold shoulder to discourage over-optimization: their pages can still appear in the results, but they sit in a "sandbox" and will not receive top rankings for any query. If, after a period of time, the site's popularity stays steady or keeps rising gradually, the search engine lifts the cold treatment and gives greater weight to its link popularity and search rankings.

SERP: Search Engine Results Page

A SERP is the list of results displayed for a particular search. A SERP is sometimes defined as a single placement among the search engine results; for the purposes of this series, it refers to the page rather than a placement.

SPAM: Search Engine Spam

Search engine spam is the use of unethical techniques to improve a site's search engine rankings. Dishonest webmasters use such tactics to trick search engines into ranking them higher. This may lift a site's ranking in the short term, but the consequences can be very serious: the search engine may permanently remove the site from its database!

Baidu search engine algorithms

Lvluo algorithm (launched February 19, 2013). This algorithm mainly targets the buying and selling of links, including hyperlink brokering, selling links, buying links, and other hyperlink cheating. Its introduction effectively curbs malicious link exchanges and mass posting of external links, and helps purify the Internet ecosystem. * Baidu Lvluo algorithm launch announcement

Pomegranate algorithm (launched May 17, 2013). The Pomegranate algorithm mainly targets pages stuffed with bad advertising that prevents users from browsing normally. It is Baidu's upgraded follow-up strike against low-quality websites and complements the earlier Lvluo algorithm. * Pomegranate algorithm – low-quality page terminator

Lvluo algorithm 2.0 (launched July 1, 2013). Lvluo 2.0 focuses on news sites that publish paid advertorials. The targets of punishment fall into three categories: advertorial trading platforms, advertorial publishing sites, and sites that benefit from advertorials. The punishments are: 1. Advertorial trading platforms are blocked outright. 2. Advertorial publishing sites are handled according to severity: a news site that publishes advertorials, but not seriously, has its rating in the search system lowered; a site that uses a subdomain to publish large numbers of advertorials has that subdomain blocked and removed from Baidu's news sources; some sites even create large numbers of subdomains just for advertorials, in which case the entire domain is blocked. 3. For sites that benefit from advertorials: if a site has only a small number of advertorial backlinks, those links are filtered out of the weight calculation and the site is observed for a period before any further action; if a site has a large number of advertorial backlinks, its rating is lowered or it is blocked outright. * Lvluo algorithm 2.0 interpretation

Ice Bucket algorithm (launched August 30, 2014). Pop-up ads, forced app downloads, requiring login to read the full text, and similar behaviors, when they occur on mobile pages, are the targets of the Ice Bucket algorithm. * Baidu mobile search Ice Bucket algorithm announcement

Ice Bucket algorithm 2.0 (launched November 18, 2014). The 2.0 upgrade mainly targets full-screen download prompts, large ads that block the main content on narrow mobile screens, forcing users to log in before use, and similar problems. * Mobile search Ice Bucket algorithm 2.0 update announcement

Ice Bucket algorithm 3.0 (launched July 15, 2016). This version severely cracks down on app call-up behavior in Baidu mobile search that interrupts the user's complete search path. In plain terms: a user reaches your page through Baidu search, but before they can see the main content they have to perform other actions, such as closing several ads or sharing the page; or the site forces them to log in or register before they can continue reading. This is punished because it interferes with the user experience. * Baidu mobile search Ice Bucket algorithm upgrade notice

Original Spark plan (May 15, 2013). This initiative cracks down on copying and similar behavior and encourages original, high-quality content. At launch it cooperated with websites that have strong original-content capability: when the latest content appears first on a given site, that originating site is given priority in the ranking. The program has since been upgraded so that active push (submission) can be implemented directly by your technical team; if the content is original, remember to mark it as such. * Baidu's Original Spark plan