Tag Archive for: Google

From the WordPress SEO book

Book Excerpt: What Are Authority Links?

The following is an excerpt (with some recent modifications and editorial comments) from our book, WordPress Search Engine Optimization (now in second edition!). You can buy the book at Amazon.

Authority Links: What They Are and Why You Want Them

Some links possess a measure of power independent of PageRank: this is the principle of authority links. Authority links are links from websites that have established a substantial degree of trust and authority with search engines as a result of their age, quality, and size. Authority is a somewhat subjective concept; unlike PageRank, neither Google nor the other search engines offer any public reference or guidelines as to what constitutes an authority site or authority link. Authority sites tend to be market leaders: established government and educational institutions, large corporations, and the leading websites in their fields. Authority links can bring tremendous ranking power to a website if you are lucky enough to obtain one or more.

Authority links are the golden eggs of link building. They are extremely difficult to get, and for that reason most webmasters rarely obtain them. The best approach is to stay vigilant for opportunities to earn authority links; actively hunting them down is most likely a waste of time.

Our discussion of PageRank and authority links leads naturally to the notion of the relative power of inbound links. No two links are the same in terms of power. The degree of authority of the linking site, the PageRank of the page upon which the link appears, and the number of outbound links on that page will all affect the relative value of the links you obtain. That said, almost all links are worthwhile, even lower-value links. With what we’ve learned in the previous few pages, you will have a strong sense of how to evaluate link opportunities and the relative strength of links.

Sometimes you’ll be forced to settle for lower-value links in higher volumes, as is the case with link directories. But never fall into the trap of thinking that the only links worth getting are high-authority, high-PageRank links. All links are good for your rankings, except links from link farms and content farms, which you should never seek out.

Link Anchor Text

A vital concept in link building is link anchor text. Link anchor text is the word or words that constitute the visible text of the link itself, the “blue underlined text” as it is often called. The anchor text of a link is a powerful ranking factor; anchor text serves as a signpost to Google as to the content and subject of the destination page.

How Anchor Text Appears in HTML Code

The anchor text of a link is coded by placing the desired text between the open and closing markup of the hyperlink:

<a href="https://tastyplacement.com/">This Is Anchor Text</a>

Whenever possible, control the anchor text of your inbound links. The problem is that you can’t always control it, and unfortunately, the higher quality the link, the more restricted you’ll be in choosing anchor text. A perfect example is the Yahoo Directory. A link in the Yahoo Directory is a great link to get, but Yahoo dictates that the anchor text be the name of your website or the name of your business; it does not allow you to stuff keywords into the anchor text. Here lies another good reason to choose a keyword-rich domain name for your website and business. When your business name is carefully crafted to comprise keywords, like “Austin Air Conditioning,” you can employ those high-volume keywords more easily in your link building efforts.

To continue an example from an earlier chapter, if you have identified the phrases “Jacksonville air conditioning,” “Jacksonville air conditioning contractors,” “Jacksonville air conditioning companies,” and “Jacksonville air conditioning repair,” as the keywords around which a specific page is built, then your anchor text selection is nearly complete. You can use the same keywords as your desired anchor text.

When you can control the anchor text, craft it from the keywords you have designated for each destination page. Combined with sound on-page optimization, this device brings tremendous ranking power into focus. Remember that Google and the other search engines have a primary goal of returning quality search results to their visitors. When anchor text accords with the on-page elements of a web page, search engines gain confidence as to the subject of that page. And when a search engine is confident about subject matter, it rewards the page with high rankings.

But be careful with anchor text when gaining links in high numbers. It is unwise to secure hundreds of links all with picture-perfect anchor text; this manner of link building does not appear natural to search engines. There is a risk of over-optimization when your link anchor text is too perfect. Generally, you never want more than 70% of the anchor text for a particular page to be based upon a small family of perfect keywords. Thus, there is a hidden benefit to garnering links for which you can’t control the anchor text: these links dilute your principal keywords to some extent.

If your anchor text isn’t varied naturally, then you should intentionally vary it. Clever SEO professionals sometimes go as far as to obtain noise links. A noise link is a link with common generic terms used as the anchor text, like “click here” or “website.”

Not all hyperlinks have anchor text. Images can serve as hyperlinks, but they carry no anchor text. In that case, search engines register the link but have no anchor text upon which to determine the subject matter of the destination page. Links in image maps and Flash files suffer from the same limitation. For this reason, such links are less desirable.
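To illustrate, here is a minimal sketch of an image serving as a hyperlink (the URL and filename are hypothetical). Note that there is no visible text between the opening and closing anchor tags for a search engine to read:

<a href="https://example.com/"><img src="logo.png"></a>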

Buy the Book Today at Amazon

Footer Links: Bad for Google Rankings, Bad for Clients

Google Says it’s Forbidden and it’s Bad for Clients

(So Why are Agencies Still Doing It?)

While working on our own WordPress SEO, we’ve learned that site-wide footer links have always been dicey at best (you know, “Designed by Denver SEO Company” in the footer of a website). They are easy to detect, both visually and algorithmically, and like any site-wide link, they generate potentially thousands of inbound links from a single IP/website. In-content links (one or two placed within a website’s content rather than hundreds in the footer or sidebar) have always been more desirable. TastyPlacement.com has many examples of in-content links throughout the site.

Examples of Obvious Footer Links Are Easy to Find

Here’s an example from an Inc. 5000 company that touts its work for the NFL and Canon, with a footer link on a client website bearing the simple-minded anchor text “Website Design”:

[Screenshot: client-site footer link with the anchor text “Website Design”]

The previous example is from an NYC design agency that ranks number 1 for “New York Web Design”.

Does Google Hate Footer Links?

Well, maybe. Google certainly has warned against them. In an October 2012 revision to its webmaster guidelines, Google warned about “[w]idely distributed links in the footers of various sites.” A valuable discussion on WebmasterWorld regarding footer links followed. Certainly, the use of footer links, especially with aggressive anchor text, should be undertaken with caution. Just as certain, though, is that footer links can still generate strong rankings.

Footer Links and Client Responsibility

There’s another facet to this question: the propriety of taking footer links on your clients’ websites. If you are a website designer or an SEO, when you take a footer link on a client website, you are doing a few things:

  • You are using your superior knowledge of the search algorithms to get a link from someone who trusts you; they might not give the link so willingly if they knew all the facts and consequences.
  • You are exposing your own web property to an inbound link that violates Google’s written webmaster guidelines.
  • You are exposing your client’s website to a potential Google violation.
  • You are taking PageRank from a client and giving it to yourself.
  • You have a more effective and safer alternative: an “About This Site” page or its equivalent. It’s still sorta sleazy, but maybe not so obvious.

If you want the possible referral business that a prestige web build might generate, you can always achieve that with a simple text citation, with no link.

 

From the WordPress SEO book

Should You Disallow Old Link Structures With Robots.txt?

Questions from Readers…

We’re getting great questions from readers of our book, WordPress 3.0 Search Engine Optimization. Today, Michael tackles a question sent in by Jeff of Houston, TX. Remember, send in those questions and feedback! We’re always thrilled to help out our readers.

Hi Mr. David,

I’m sorry to contact you with such an insignificant matter, but I just got your book today and wanted to ask if you could clarify an issue that I have encountered. My site has been up for about 6 months, and I had been using a permalink structure of /year/month/day/postname, which I changed to /category/postname. I also used Dean’s Permalink Migration plugin to add 301 redirects for published posts.

I want to use your Ultimate Robots.txt file on my site, but I’m wondering: if I add the “Disallow: /2011/” directive to eliminate duplicate content in my archives, will it disallow my previous posts that had /2011/ in the old permalink structure? Any help or clarification on this issue would be very appreciated. Thank you for your time.

Jeff

Houston, TX

Jeff,

We love hearing from readers.

Yes, I believe that if you add the directive Disallow: /2011/ you will block the year archives from indexing, but also any post that uses the year in that position as part of its permalink structure. I tested it, and it appears to disallow that content.
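To illustrate with a minimal robots.txt sketch (the example URLs are hypothetical): the directive matches any URL whose path begins with /2011/, so it catches the year archive and the old-style permalinks alike.

User-agent: *
Disallow: /2011/

# Blocked: http://example.com/2011/ (the year archive)
# Blocked: http://example.com/2011/05/14/my-post/ (old permalink structure)
# Allowed: http://example.com/my-category/my-post/ (new permalink structure)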

You can test your robots.txt file by using Google Webmasters’ Crawler Access testing tool. The tool lets you test the text of a robots.txt file and compare it to a specific URL. The tool then tells you if your robots.txt file is allowing or blocking the URL. You can find the tool by logging into Google.com/webmasters and then selecting “Site Configuration” and then “Crawler Access” from the left menu. We didn’t cover this specific tip in WordPress 3.0 Search Engine Optimization, but we will implement it in a future edition of the book.

But you say you’ve changed your permalink structure, and that should solve the problem. In a case where a robots.txt entry blocking year archives would also block regular blog posts from getting indexed, the solution is clear: don’t block either. Just make sure your year archive is set to display excerpts of the posts rather than the full text of the posts.

Michael

Buy the Book Today at Amazon

Are Site-wide H1 Tags in WordPress Good or Bad?

Questions from Readers

The great thing about writing our book, WordPress 3.0 Search Engine Optimization, is we get to hear from all those readers who have taken our material and put it to work in the field. Today, we’ve got a fascinating question from Robert, who asks that question we confront every day in one way or another: Just how far should I trust Google’s sophistication?

Hi Michael,

I’m currently reading your Packt book on WordPress SEO, and I have a quick question about HTML5 and the way it uses header tags. Your book says to use only one H1 tag per page, which makes sense. However, HTML5 advocates multiple H1 tags per page, as long as each is contained in a separate section/header.

Worse yet, the first H1 tag on a page is usually a wrapper around the home link logo and contains the same meaningless title text on every page. You can see a typical example at CSS3maker.com:

<header>
  <h1 id="logo"><a href="index.html" title="CSS 3.0 Maker">Css 3.0 Maker</a></h1>
</header>

Most SEO bloggers assume single H1 tags are a thing of the past. Based on your experience, has there been any evidence that Google/Yahoo interpret HTML5 content any differently than HTML/XHTML?

If not, should I remove the header and h1 tags around my logo anchor tag? My site looks like the CSS3maker code above. And like them, I don’t have anything else in my header, so if I remove the H1 tag, wouldn’t I also just scrap the header tag? I have a meaningful H2 tag in my content section, which could be elevated to an H1 tag.

Thanks,
Robert

BTW, I’m really enjoying your book.

 

Robert,

This may be a cop out…but does this help?

I think Google is tuned in enough to ignore site-wide H1 tags. One of my philosophies is “packaging”: make it so brain-dead easy for a search engine that it can’t POSSIBLY get confused. We are sort of on-page nerds when it comes to that stuff. Most of the pages we create are pretty perfect, at least on the page.

Do we, in our SEO business, remove site-wide H1 tags around logos and site names in the header? Absolutely we do, but I don’t think it’s the kiss of death if you don’t. Remember one thing: Google has to fit its algorithm so that it doesn’t punish sites for small mistakes; otherwise, it would punish 80% of the web or more.

I am very glad you are enjoying the book!

Michael

Buy the Book Today at Amazon

How to Diagnose a Google Penalty

How to Diagnose a Google Ranking Ban, Penalty, or Filter

The following is an excerpt (with some recent modifications and editorial comments) from our book WordPress Search Engine Optimization (now in second edition!). You can buy the book at Amazon.

If you undertake black hat or gray hat techniques, you run a fair chance of having your site penalized in the search results. But even if you are not engaged in these techniques yourself, your site may be punished for associating with black hat purveyors. Hosting on a shared server with bad neighborhoods, or sharing domain registration information with them, can lead to ranking problems, if not outright punishment. Certainly linking to a bad neighborhood can lead to discipline. And if you purchase a domain, you’ll inherit any penalties or bans imposed on the prior version of the website.

There is a wide range of penalties and ranking filters that search engines impose, and a still-wider range of effects that those penalties produce. In diagnosing and correcting ranking problems, more than half the battle is figuring out which penalty, if any, has been imposed and for what violation. Ranking problems are easy to fix but arduous to diagnose with precision. Sudden drops in rankings might lead you to suspect that you’ve received a penalty, but it might not be a penalty at all.

In the following section we’ll look at some specific penalties, filters, conditions, and false conditions, and how to diagnose ranking problems.

Google Ban

The worst punishment that Google serves upon webmasters is a total ban: the removal of all pages on a given domain from Google’s index. A ban is not always a punishment; Google “may temporarily or permanently remove sites from its index and search results if it believes it is obligated to do so by law.” But Google also warns that “certain actions such as cloaking, writing text in such a way that it can be seen by search engines but not by users, or setting up pages/links with the sole purpose of fooling search engines may result in removal from our index.”

One of the most newsworthy instances of a total ban was when Google, in 2006, issued a total ban to the German website of carmaker BMW (http://www.bmw.de). The offense? Cloaked doorway pages stuffed with keywords that were shown only to search engines, and not to human visitors. The incident became international news, ignited at least partially by the SEO blogging community. BMW immediately removed the offending pages and within a few weeks, Google rescinded the ban.

How to Diagnose a Total or Partial Ban

To diagnose a full or partial ban penalty, run the following tests and exercises:

  • Check Google’s index. In the Google search field, enter the specialized search query site:yourdomain.com (substituting your own domain). Google then returns a list of all of your site’s pages that appear in Google’s index. If your site was formerly indexed and the pages are now removed, there is at least a possibility that your site has been banned from Google.
  • Check if Google has blacklisted your site as unsafe for browsing (type http://www.google.com/safebrowsing/diagnostic?site=mysite.com with your domain at the end).
  • Check for Nofollow/Noindex settings. It might seem obvious, but make sure you haven’t accidentally set your WordPress site to noindex. To check, go to your WordPress Dashboard and click the “Privacy” option under “Settings.” If the second setting, “I would like to block search engines, but allow normal visitors,” is selected, your site will promptly fall out of the index. A stray entry in a robots.txt file or in your WordPress template files can likewise instruct search engines not to index your entire site (see the sketch just below this list).
  • Check Google Webmaster Tools. Sometimes, but not always, Google will notify you through your Webmaster Tools account that your site has been penalized; you can be penalized without ever receiving this message. See the image below for an example.

Google Webmaster Tools penalty message. In this example, the message notes, “we detected hidden text….”
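As a sketch of what those stray noindex settings look like (assuming a WordPress site of this era), the Privacy option described above causes WordPress to emit a robots meta tag in the head of every page:

<meta name='robots' content='noindex,nofollow' />

And in robots.txt, a single errant rule is enough to block an entire site:

User-agent: *
Disallow: /

Either one is enough to remove a site from the index, so rule both out before suspecting a penalty.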

PageRank Adjustment/PageRank Penalty

An alternative penalty short of an outright ban is a PageRank adjustment. The adjustment can be partial (a drop from a PR4 to a PR2) or full (a drop to PR0). With a PageRank adjustment, Google simply lowers or removes the PageRank value for a site. Google often imposes this punishment upon low-value general directories that sell links. Part of the difficulty in diagnosing and repairing a PageRank penalty is that the PageRank Google shows to users is historical; sometimes six months pass between PageRank updates.

How to Diagnose a PageRank Penalty

To diagnose a Google PageRank penalty, run the following tests and exercises:

  • Check your inbound links. Whenever your PageRank drops, the most likely reason is that you’ve lost valuable links. Check your link profile in Yahoo Site Explorer. Have you lost any premium, high-PR links you had formerly? Use the reliability of the PageRank algorithm to help diagnose: if you have a PR4 link pointing into one of your pages, and that PR4 link has only one outbound link, that one link alone will be strong enough to make the destination page a PR1 or a PR2. If despite such a link your page remains a PR0, that raises the likelihood of a PageRank penalty.
  • Check all pages. Be sure to check every page on your site; you might simply have PageRank shifting around within your site. Generally, though, your home page will have the highest PageRank value of any page on your site. So if you’ve got a PR0 on all pages, including the home page, a PageRank penalty is the likely suspect.
  • Check canonicalization. Recall the “www” and “non-www” distinction and that search engines see these as separate domains in some cases. WordPress handles this automatically, but some online tools don’t check it for you, so be sure you are checking both the www and non-www versions of your domain.
  • Compare PageRank. Compare Google’s reported PageRank score for your pages with SEOmoz’s mozRank. Typically, these two scores correlate loosely (within about 10%). If the Google score is much lower than the mozRank score, it’s likely that Google is trimming some PageRank. You can see the mozRank score with the free SEO Site Tools plugin or by visiting http://www.opensiteexplorer.org/.
[Screenshot: PageRank penalty evidence in the SEO Site Tools plugin]

Visible evidence of a Google ranking penalty in the SEO Site Tools plugin; all the elements of a ranking penalty are present. The inbound link count is healthy with over 3,500 links pointing to this domain. SEOmoz’ mozRank (erroneously called “Page Rank” in the screenshot) is a healthy 4.41. Nevertheless, Google’s PageRank is a zero. This is clear evidence of a Google PageRank penalty.

  • Check internal links. In Google Webmaster Tools, Google reveals its profile of the internal links on your site. See the figures below for examples of an unhealthy internal link profile and a healthy one. If your site has 100 indexed pages but Webmaster Tools references only a handful of links, Google is not properly processing your internal links. Be careful here, because a range of conditions can cause this; it can arise from a PageRank penalty, but also from a poor internal navigation structure.
[Screenshot: unhealthy internal link profile in Google Webmaster Tools]

This Google Webmaster Tools screenshot shows an unhealthy internal link profile, and is the same site shown in the screenshot just above. This site is a low-value link directory, a likely candidate for a Google PageRank penalty.

[Screenshot: healthy internal link profile in Google Webmaster Tools]

This Google Webmaster Tools screenshot shows a healthy internal link profile. All or nearly all pages on the website are represented in the internal link profile, and the number of links to each page is relatively constant.

The -950 Ranking Penalty

Google sometimes applies a -950 ranking penalty to individual pages (but not to entire sites) for particular search queries. The -950 penalty means that for a particular search, your page has 950 positions added to its rank. So for a term where you formerly ranked on page one of Google’s search results in position three, you’d now rank on page ninety-six of the results, at position 953. Sound harsh? It is, and Google has made faint references to it as a penalty for over-optimization. Some SEO professionals contend that they have seen the penalty imposed for shady link building practices.

How to Diagnose a -950 Ranking Penalty

Diagnosing a -950 ranking penalty is easy: try search terms for which you formerly ranked (hopefully you noted their exact former positions) and follow the search results out to page 95 or 96. Remember that you can always set Google to display 100 results instead of ten by using the advanced search options at Google.com, which is convenient for checking ranking positions in the 100s and beyond.
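Google also accepts the num URL parameter, which sets the number of results per page directly (the query shown is hypothetical):

https://www.google.com/search?q=jacksonville+air+conditioning&num=100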

The -30/-40 Ranking Penalty

Google often serves up another variety of penalty: the -30 or -40 position penalty. This penalty is applied by Google to entire sites, not just particular pages, and not just for particular search queries. It is common enough to trip up legitimate webmasters for very minor oversights or offenses. Most signs point to the -30 penalty being applied algorithmically and being “forgivable”: changing the condition that led to the penalty automatically reverses it. This penalty has historically been imposed upon sites serving up poor-quality content, for example, sites that display thin content. Thin content is content that is partially generic, as with an affiliate site repeating common descriptions of the products it sells. Low-value directories have also been served this penalty.

How to Diagnose a -30/-40 Penalty

If you suspect that your site has been hit with a -30/-40 penalty, there is one sure-fire test. Perform a Google search for your domain name, without the “www” and without the “.com” or “.net” part of the domain. Under normal circumstances, this search should return your site at or near the first position (depending a bit on the competition for that term). If this test shows your site dropped to the 40s or 50s, it is almost certainly a -30/-40 penalty.

False Positives That Aren’t Penalties

Don’t assume you’ve been penalized by Google just because your rankings drop, or because your rankings remain poor for a new site. Ranking positions can jump around naturally, especially just before algorithm updates, when Google revises its search engine rules. You may also have lost one or more valuable inbound links, which can lead to a drop in rankings. Or you may be alternating between Google’s personalized search modes. Personalized search is a Google feature that returns results based on your personal browsing habits. So if you’ve visited your own website in the past few days, Google will return your website near the top of the results, figuring that it’s one of your personal favorites. Personalized search is a convenience tool, but it doesn’t return true rankings. To see actual ranking results, you need to make sure personalized search is off. To do this, look in the upper left-hand corner of any Google search results page for “Personalize Search On” and click the link just under it that reads “Turn it off.”

Google penalties are almost never imposed for no reason at all. Yes, Google sometimes penalizes light offenders while more egregious violators go unpunished. That might not seem fair, but it doesn’t change the fact that if you have perfectly complied with Google’s Webmaster Guidelines, you are extremely unlikely to be penalized. If you’ve been penalized, there’s a reason.


Does Google Hate Forum Sites? Millions of Pages Disappear From Index

Millions of Pages Disappear From Index (or Do They?)

We at TastyPlacement have noticed that millions of pages of popular forum sites have disappeared from Google’s index in the past few months. Testing the index, a Google search for the custom string “site:forums.digitalpoint.com,” which would normally return tens of millions of pages, returns only 256,000 results. Similarly, the tremendously popular forum site fanforum.com shows only 58,000 indexed pages in Google. Typically, the “site:” search query generates a reliable count of the number of pages Google has indexed for a particular domain.

Testing in other niches shows that the trend is broad: ClubLexus.com shows only 57,000 indexed pages, a narrow fraction of its total page count. If you prowl around, you’ll see the same thing: notebookforums.com, sitepoint.com, every forum site we checked gave the same result.

Here’s the kicker: all of the pages within these forum sites appear to be searchable. The pages are indexed and searchable, but not reported as indexed.

Hmmm…not sure how to parse this, but it could reflect Google making a formal, algorithmic devaluation in the way it treats forum pages. This change might mean that links on those forum pages are significantly devalued.

From the WordPress SEO book

SEO Master Class: The Mathematics and Operation of Google PageRank

The following is an excerpt (with some recent modifications and editorial comments) from our book, WordPress Search Engine Optimization. You can buy the book at Amazon.

The Mathematics and Operation of Google PageRank

Google’s PageRank is part of its search algorithm; the other search engines’ ranking algorithms work similarly. Yahoo and Bing, while they obviously measure inbound link counts as a ranking factor, do not disclose to web users any measure of page value equivalent to PageRank. PageRank works through complex mathematics. Understanding the mathematical intricacies is not vital, but can help illuminate how PageRank impacts your link building efforts. PageRank works the same on all platforms, WordPress or otherwise.

The PageRank Calculation

The PageRank calculation works as follows: Google assigns a numerical value to each indexed page on the web. When an indexed page hyperlinks to another page, a portion of that numerical value is passed from the linking page to the destination page, thereby increasing the destination page’s PageRank. Inbound links increase the PageRank of your web pages, and outbound links decrease it. PageRank, often abbreviated as “PR,” is expressed as a number from 0 to 10. Google.com and Facebook.com, both of which benefit from millions of inbound links, enjoy a PageRank of 10. In common parlance, a PageRank 10 site is referred to as a “PR10 site.” Remember, though, that PageRank refers to pages on the web, not just sites themselves; a “PR5 site” simply means that the site’s front page is a PR5.

So how is PageRank specifically calculated? Every indexed page on the web enjoys a small amount of PageRank on its own, a PageRank score of 1. This inherent PageRank is the original source of all PageRank on the web; it is only through linking between pages and sites that some pages accumulate higher PageRank than others. However, a page can never send all of its PageRank to other pages—this is where the damping factor comes into play. The damping factor is simply a number between 0 and 1 (but think of it as zero to 100 on a percentage scale); it represents the amount of PageRank that can be sent away from a page when that page links out to other pages.

If a search algorithm’s damping factor were set to zero, no page would ever send PageRank away, and the entire PageRank calculation would be pointless. On the other hand, if the damping factor were set to 1, then 100% of a page’s PageRank would be sent away through outbound linking, and any page with outbound links would retain no PageRank. In that case the algorithm also fails: the web would be populated entirely by pages of either PR0 or PR10, with nothing in between. As it happens, the damping factor employed by Google is widely believed to be 0.85. This means that 85% of a page’s PageRank is available to be passed to other pages through linking, while 15% of a page’s PageRank is always retained. It is believed that Google can alter the damping factor for particular sites.
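For the mathematically curious, here is a minimal sketch in Python of the classic calculation from the original PageRank paper: PR(p) = (1 - d) + d × Σ PR(q)/C(q), where the sum runs over the pages q that link to p, and C(q) is the number of outbound links on q. The three-page web is hypothetical, and real PageRank runs at vastly greater scale:

# Iterative PageRank over a tiny hypothetical three-page web.
links = {
    "A": ["B"],          # page A links to page B
    "B": ["A", "C"],     # page B links to pages A and C
    "C": ["A"],          # page C links to page A
}
d = 0.85                 # the widely believed damping factor
pr = {page: 1.0 for page in links}   # every page starts with a score of 1

for _ in range(50):      # repeat until the values settle
    pr = {
        page: (1 - d) + d * sum(
            pr[q] / len(links[q]) for q in links if page in links[q]
        )
        for page in links
    }

print(pr)  # page A ends up highest: it has the most inbound links

Note the division by len(links[q]): a page’s PageRank is split among its outbound links, the sharing principle discussed below. With d at 0.85, every page keeps a floor of 0.15 no matter how much it links out, and raw scores like these are believed to be mapped onto the logarithmic 0-to-10 scale discussed later.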

Consider for a moment that Google manages PageRank calculations for billions of web pages. If that wasn’t daunting enough, consider that Google undertakes the even more staggering task of managing the mathematical calculations of immeasurable numbers of links between those billions of sites.

PageRank, Diagrammatically

This graphical illustration of PageRank calculations for a hypothetical group of web pages shows PageRank accumulating at site “B” because it enjoys a high number of inbound links. The sites represented by the small circles at the bottom of the illustration retain only 1.6% of the PageRank distribution because they link outward and have no inbound links. Note also that site “C” enjoys a healthy amount of PageRank simply because it receives a single link from site “B.”

You Have to Share Your PageRank

Also bear in mind that the amount of PageRank available to be passed by a page is divided equally among all the outbound links on that page. So if a web page has a total of six links, three internal and three external (links to outside websites), then the PageRank passed away by that page is shared equally among those six links.

What does that mean for the link builder? Well, it means that if you have secured a link on a great PR4 page, but that page has 200 outbound links, then you’ll be sharing the available PageRank with 199 other sites. That’s why you want to seek out pages with low numbers of outbound links. When there are fewer outbound links, your link will enjoy a much greater percentage of the available PageRank.

The Logarithmic PageRank Scale

If the mathematics underlying PageRank weren’t complicated enough, there is another facet to consider. The PageRank scale of PR1 to PR10 isn’t linear; it is logarithmic. It takes roughly ten times as much linking power to rise from one PageRank level to the next; expressed another way, a PR4 page has 100 times the linking power of a PR2 page. As each level of PageRank is reached, it becomes harder and harder to reach the next level. There are only about 120 to 150 PR10 pages at any given time, and generally this elite class includes Google.com, Microsoft.com, WhiteHouse.gov, and other sites of equivalent popularity and character.

PageRank Is Historical

PageRank is historical, updated only every three months or so (although sometimes much longer periods pass between PageRank updates; it’s really up to the whim of Google). When you check the PageRank of a page, you aren’t seeing the current PageRank; you are seeing the PageRank reported as of the last PageRank update.

Buy the Book Today at Amazon

Google Hides “Dead” Girl on Google Maps: “This Image Is No Longer Available”


Last week, we reported on the apparent dead girl on Google Maps. As it turned out, the girl was “playing” dead in the street just as Google’s imaging van drove by. The widespread media frenzy over the photo ultimately prompted Google to remove the image from Google Places, which now shows a blank image with the message “This image is no longer available.” You can see the removed image here.


The Daily Mail of the UK reports that Google has been forced to remove some images from its “Street View” feature throughout the UK based on privacy protests.

Here’s a screenshot of the original Maps entry showing the young girl in the street:

[Screenshot: the original Google Maps Street View image]


Google Secure Search: Enhanced Privacy Tool That Che Guevara Would Love

Today, Google offered a new feature to enhance its already-robust search capabilities: Secure Search. Google is the first major search engine to offer search in a secure setting.

Pictured below, secure search works and looks just like traditional search, but it operates over SSL (Secure Sockets Layer) and can be found at https://www.google.com. The “https” denotes a secure internet browsing connection, unlike traditional “http” locations.

What Does it Do?

Google’s secure search protects the transmission of data between a user’s computer and Google’s servers. So a user searching for “ways to pass a drug screening” would enjoy enhanced security for that search: the user’s search query could not be intercepted in transit by persons snooping on internet traffic, which, in an unsecured environment, is largely open to viewing by anyone. It’s the same technology that protects the transmission of credit card numbers.

However, it has limitations: the user’s browser settings may retain the search query, making it visible to coworkers, spouses, or anyone with access to the physical computer on which the search was made.

No Change in Search Results

But will this new feature change the order in which search results appear? No. TastyPlacement tested a variety of phrases in both environments, and the search results are unchanged.

Who Won’t Like It?

That’s easy: the Chinese and North Korean governments. These governments snoop on their citizens by intercepting all sorts of internet traffic, including search queries. The secure connection means that a dissident in China can search for information without having his or her search queries read by China’s ubiquitous internet police. Of course, once a person clicks on a link, the visit to the destination website will be visible to snoopers. And, of course, the Chinese government can simply attempt to block access to all of Google’s servers.