Friday, February 4, 2011
Google 2011 Algorithms - Google's Scraper Update Is Live, Not the Content Farm Update
Hello everyone. I have found that many websites have disappeared from Google's search results, and others have dropped in the rankings.
Don't worry about this issue. For the last two or three days I have been researching how to survive Google's scraper update and how to prepare for the next content farm algorithm.
We all know very well that Google is trying to crack down on web spam.
Based on my research, I have found some tips and techniques to help Google consider your site a high-quality site.
We also have ways to find low-quality pages; the best option is Webmaster Tools.
How to find low-quality pages on your site:
Open or sign in to Webmaster Tools.
Go to Site Configuration:
- Keep your sitemap updated with all new pages.
- Important tip: URL parameters. If your site uses URL parameters, some of them may be unnecessary for page navigation. Asking Google to ignore these parameters can reduce duplicate content in Google's index and make the site more crawlable.
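As an illustration of keeping the sitemap current, here is a minimal Python sketch that builds a sitemap.xml from a list of pages; the example.com URLs and the page list are placeholders, and the tags follow the sitemaps.org protocol.

# Minimal sketch: build a sitemap.xml from a list of (url, lastmod) pairs.
# The URLs below are placeholders; feed in however your site stores pages.
from xml.sax.saxutils import escape

def build_sitemap(pages):
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url, lastmod in pages:
        lines.append('  <url>')
        lines.append('    <loc>%s</loc>' % escape(url))
        lines.append('    <lastmod>%s</lastmod>' % lastmod)  # YYYY-MM-DD
        lines.append('  </url>')
    lines.append('</urlset>')
    return '\n'.join(lines)

if __name__ == '__main__':
    pages = [('http://www.example.com/', '2011-02-04'),
             ('http://www.example.com/new-post.html', '2011-02-04')]
    with open('sitemap.xml', 'w') as f:
        f.write(build_sitemap(pages))

Regenerating and resubmitting this file whenever you publish a page makes the sitemap tip above automatic.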
Go to "Your site on the web":
- Look at Search queries: if you find unrelated search queries, your site is at risk of being treated as a low-quality site, so the solution is to remove that type of content from your site.
- Check Top pages: the search queries report helps you set priorities among your top search pages. If irrelevant pages are being found more often, improve the relevant pages so that those are the ones searchers land on.
Links to your site:
Here you can find both the best sites and the low-quality sites that link to you. Using this report, you can easily identify low-quality linking sites that have not been crawled for a long time.
Keywords:
These are the most common keywords Google found when crawling your site, and they should reflect your site's subject matter. If you find irrelevant keywords, remove the content that produces them from your site.
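As a rough illustration of hunting for unrelated queries and keywords, here is a Python sketch that scans a CSV export of the search queries report; the file name, the column layout, and the topic word list are all assumptions to adapt to your own site.

# Minimal sketch: flag exported search queries that contain none of your
# site's topic words. File name, column layout, and TOPIC_WORDS are
# assumptions; adjust them to your actual Webmaster Tools export.
import csv

TOPIC_WORDS = {'seo', 'google', 'search', 'algorithm'}

def unrelated_queries(path):
    flagged = []
    with open(path) as f:
        reader = csv.reader(f)
        next(reader, None)  # skip the header row
        for row in reader:
            query = row[0].lower()  # assumes the query is in column 1
            if not any(word in query for word in TOPIC_WORDS):
                flagged.append(query)
    return flagged

if __name__ == '__main__':
    for q in unrelated_queries('search_queries.csv'):
        print(q)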
The most important tips: check your Diagnostics section.
Malware: check for all malware warnings and resolve them.
Crawl errors: solving all crawl errors helps keep your site out of the low-quality list.
HTML suggestions: if you fix all of the issues listed under HTML suggestions, your site will move toward the high-quality list.
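As a complement to the crawl errors report, here is a minimal Python sketch that checks a list of your own URLs for broken responses; the example.com URLs are placeholders.

# Minimal sketch: check your own URLs for crawl problems (404s, server
# errors). The URL list is a placeholder.
import urllib.request
import urllib.error

URLS = ['http://www.example.com/',
        'http://www.example.com/old-page.html']

for url in URLS:
    try:
        resp = urllib.request.urlopen(url, timeout=10)
        print(url, 'OK', resp.getcode())
    except urllib.error.HTTPError as e:
        print(url, 'HTTP error', e.code)   # e.g. 404: fix or redirect it
    except urllib.error.URLError as e:
        print(url, 'unreachable:', e.reason)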
Site Performance Overview:
If your site's load time is high, your site may also be considered a low-quality site.
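To get a rough number for load time outside of Webmaster Tools, here is a small Python sketch that times a full page download; example.com is a placeholder, and real page load time (with images, scripts, and rendering) will be higher than this raw HTML fetch.

# Minimal sketch: time how long a page's HTML takes to download.
import time
import urllib.request

def load_time(url):
    start = time.time()
    with urllib.request.urlopen(url, timeout=30) as resp:
        resp.read()  # download the full response body
    return time.time() - start

if __name__ == '__main__':
    print('%.2f seconds' % load_time('http://www.example.com/'))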
Some more hints:
- Google hopes to include reversed 404 pages sooner.
- AdWords titles may now include the first line of the description.
- Google search results may now include Hotpot content, such as reviews and photos.
- Google admitted they have an issue with owner-verified replies on Place pages.
Off-page suggestions for the Google scraper update:
- Use a sufficient amount of original text content, supported by images, videos, and other multimedia as appropriate; such pages are rapidly indexed by the search engines.
- Submit new pages to the engines via XML sitemaps, and feature them on the homepage or another highly authoritative hub page in the site (such as a category homepage) until they have been indexed.
- If the site has a blog, make sure it pings the search engines when a new post is published (most do), and then use the blog to publish or link to new content on the site; see the ping sketch after this list.
- Consider regularly why a third party would naturally link to the site's pages or share them on Twitter, for example. If a firm or site cannot think of a good reason, it may need to go back to the drawing board.
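For blogs that do not ping automatically, here is a minimal Python sketch using Google's sitemap ping endpoint, which accepted this URL form at the time of writing; the sitemap URL is a placeholder.

# Minimal sketch: ping Google with your sitemap URL after publishing.
# The sitemap URL is a placeholder.
import urllib.parse
import urllib.request

def ping_google(sitemap_url):
    ping_url = ('http://www.google.com/ping?sitemap=' +
                urllib.parse.quote_plus(sitemap_url))
    with urllib.request.urlopen(ping_url) as resp:
        return resp.getcode()  # 200 means the ping was received

if __name__ == '__main__':
    print(ping_google('http://www.example.com/sitemap.xml'))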
I gathered all of the suggestions above from many videos, blogs, and news reports.
Tuesday, June 8, 2010
New Google Features Create SEO Opportunities - sideways query, Google Squared, PageRank
Google’s Matt Cutts always offers helpful advice, and our conversation with him at Google I/O was no exception. Cutts catches us up on a variety of search items including Google Squared, PageRank, and the recent redesign to Google’s search results page.
Google Squared is a new tool that puts search results into a spreadsheet-like list. It essentially organizes the results into facts, so users don’t have to click on multiple sites to find what they need. Cutts refers to it as a “sideways query” and points out that it could provide new information for users that they would not have previously found using traditional search.
When we spoke with Cutts earlier this year, he mentioned the growing obsession that SEOs and webmasters have with PageRank. We asked him about it in the above video, and while he did say it was important, he was quick to point out that it was only one of the more than 200 signals Google takes into consideration. He says content, title, URL, and proximity are a few of the factors that have additional influence.
Users have also probably noticed the redesign of the search results page. Cutts says the left-hand navigation was present for a while before the company decided to surface it for all search results.
Interestingly enough, the options are different based on each query. A search for Tom Cruise, for example, would probably return image results in the navigation. On the other hand, a search for President Obama would return real-time results and updates. Cutts says it creates more opportunities for webmasters and SEOs.
Lastly, Cutts did say that Caffeine was coming along nicely and indicated that there would be some announcements regarding it soon. Keep watching WebProNews.com for all the latest details.
Thursday, May 27, 2010
Google Confirms “Mayday” Update Impacts Long Tail Traffic, Google Algo Update 2010, Googler Matt Cutts
Google made between 350 and 550 changes in its organic search algorithms in 2009. This is one of the reasons I recommend that site owners not get too fixated on specific ranking factors. If you tie construction of your site to any one perceived algorithm signal, you’re at the mercy of Google’s constant tweaks. These frequent changes are one reason Google itself downplays algorithm updates. Focus on what Google is trying to accomplish as it refines things (the most relevant, useful results possible for searchers) and you’ll generally avoid too much turbulence in your organic search traffic.
However, sometimes a Google algorithm change is substantial enough that even those who don’t spend a lot of time focusing on the algorithms notice it. That seems to be the case with what those discussing it at Webmaster World have named “Mayday”. Last week at Google I/O, I was on a panel with Googler Matt Cutts who said, when asked during Q&A, “this is an algorithmic change in Google, looking for higher quality sites to surface for long tail queries. It went through vigorous testing and isn’t going to be rolled back.”
I asked Google for more specifics and they told me that it was a rankings change, not a crawling or indexing change, which seems to imply that sites getting less traffic still have their pages indexed, but some of those pages are no longer ranking as highly as before. Based on Matt’s comment, this change impacts “long tail” traffic, which generally is from longer queries that few people search for individually, but in aggregate can provide a large percentage of traffic.
This change seems to have primarily impacted very large sites with “item” pages that don’t have many individual links into them, might be several clicks from the home page, and may not have substantial unique and value-added content on them. For instance, ecommerce sites often have this structure. The individual product pages are unlikely to attract external links and the majority of the content may be imported from a manufacturer database. Of course, as with any change that results in a traffic hit for some sites, other sites experience the opposite. Based on Matt’s comment at Google I/O, the pages that are now ranking well for these long tail queries are from “higher quality” sites (or perhaps are “higher quality” pages).
My complete speculation is that perhaps the relevance algorithms have been tweaked a bit. Before, pages that didn’t have high quality signals might still rank well if they had high relevance signals. And perhaps now, those high relevance signals don’t have as much weight in ranking if the page doesn’t have the right quality signals.
What’s a site owner to do? It can be difficult to create compelling content and attract links to these types of pages. My best suggestion to those who have been hit by this is to isolate a set of queries for which the site now is getting less traffic and check out the search results to see what pages are ranking instead. What qualities do they have that make them seem valuable? For instance, I have no way of knowing how amazon.com has fared during this update, but they’ve done a fairly good job of making individual item pages with duplicated content from manufacturers’ databases unique and compelling by adding content like user reviews. They have set up a fairly robust internal linking (and anchor text) structure with things like recommended items and lists. And they attract external links with features such as the my favorites widget.
Source: http://searchengineland.com