Merging Duplicate Yelp Listings: Tutorial for Business Owners (and Analysts)
by Roselly Hernandez

Having duplicate Yelp listings for your business may make it difficult for potential clients to find you and is not ideal for your local campaign.
If you are a business owner, but not necessarily Yelp-savvy, you might not be aware that you created duplicate listings for your business. This can happen while you’re trying to create a listing for a different location, update your business info, or change your business name – which was the case for our client.
If you have a similar issue, this tutorial can help you, or at the very least, point you in the right direction.
When merging duplicate Yelp listings, make sure you are logged in as a business owner because it will make things much easier. In the business-owner back end, Yelp provides a phone number: (877) 767-9357. This is the number for the Yelp Advertising department.
When I called, a Senior Account Executive helped me with my request. Luckily for me, the executive emailed some info regarding my request, so I now have his email address as well as his direct line for future inquiries. Even if the person helping you at Yelp doesn't email you, ask for their contact info for your reference. Now, back to merging listings.
Let whoever is helping you know exactly what you're trying to achieve. In this case, I was trying to merge two duplicate listings into one and update the business name.
In this instance, since we were working with a doctor’s personal listing as well as his practice, he had to give written consent to Yelp. The executive at Yelp emailed me the following statement to forward to the doctor to sign:
I, __________ , give Yelp my consent to merge my individual Yelp listing with my practice listing. I understand that this will not be undone in the future, and I will lose out on the ability to take any photos, reviews, tips, videos, or other content with me in the future should I move locations or practices.
It took some time, but I got this back from the doctor and emailed it immediately to the Yelp executive. I did not hear back from him for a couple of weeks, so I called his direct line (again, I strongly urge you to record your Yelp contact's info).
The Yelp executive said he remembered our first conversation and would complete my request right away. I checked the listings about an hour later, and…
TA-DAH! The listings had been merged.
Merging duplicate Yelp listings can actually be quite simple. The whole process should take about a week or so if you are checking on it regularly. It will take effective communication and possibly some phone calls, but you can do it too!
Website Visitors Drop After Switch to HTTPS
by Michael David

Google announced recently that it would reward websites that convert to HTTPS with improved ranking positions.
Ok, sure, fine, WHATEVER.
But wait, maybe this is a good thing? Let me spend 6 or 7 hours on this to find out.
Thanks for Nothing, Google
Well, things didn’t work out for us on one of our web properties. We converted to HTTPS on the day after Google’s announcement. Our traffic declined by over 18% compared to the same period in the prior month (comparing 14 days of traffic from a Thursday to a Wednesday, to avoid weekend/weekday fluctuations).
Conclusion: I Heart Hate You, Google
Footer Links: Bad for Google Rankings, Bad for Clients
by Michael David

Google Says it's Forbidden and it's Bad for Clients
(So Why are Agencies Still Doing It?)
While working on our own WordPress SEO, we’ve learned that site-wide footer links have always been dicey at best (you know, “Designed by Denver SEO Company” in the footer of a website). They were easy to detect both visually and algorithmically, and like any site-wide link, they generate potentially thousands of inbound links from one IP/Website. In-content links (one or two from a website’s content rather than hundreds in the footer or sidebar) were always more desirable. TastyPlacement.com has many examples of in-content links throughout the site.
Examples of Obvious Footer Links Are Easy to Find
Here’s an example from an Inc. 5000 company that touts its work for the NFL and Canon, with a footer link on a client website bearing the simple-minded anchor text “Website Design”:
The previous example is from an NYC design agency that ranks number 1 for “New York Web Design”.
Does Google Hate Footer Links?
Well, maybe. Google certainly has warned against it. In an October 2012 revision to its webmaster guidelines, Google warned about “[w]idely distributed links in the footers of various sites.” A valuable discussion on Webmaster World regarding footer links followed. Certainly, the use of footer links, especially when used with aggressive anchor text, should be undertaken with caution. Just as certain though is that footer links can still generate strong rankings.
Footer Links and Client Responsibility
There’s another facet though to this question, and that is the question of taking footer links on your clients’ websites. If you are a website designer or an SEO, when you take a footer link on a client website, you doing a few things:
- You are using your superior knowledge of the search algorithms to get a link from someone who trusts you; they might not give the link so willingly if they knew all the facts and consequences.
- You are exposing your own web property to an inbound link that violates Google’s written webmaster guidelines.
- You are exposing your client’s website to a potential Google violation.
- You are taking Page Rank from a client and giving it to yourself.
- You have a more effective and safer alternative, an “About This Site” page or its equivalent–still sorta’ sleazy, but maybe not so obvious.
If you want the possible referral business that a prestige web build might generate, you can always achieve that with a simple text citation, with no link.
Tutorial: Block Bad Bots with .htaccess
by Michael David

In this tutorial, we'll learn how to block bad bots and spiders from your website. This is a standard safety measure we implement with our WordPress SEO service. We can save bandwidth and performance for customers, increase security, and prevent scrapers from putting duplicate content around the web.
Quick Start Instructions/Roadmap
For those looking to get started right away (without a lot of chit-chat), here are the steps to blocking bad bots with .htaccess:
- FTP to your website and find your .htaccess file in your root directory
- Create a page in your root directory called 403.html; the content of the page doesn't matter, ours is a text file with just the characters "403"
- Browse to this page on AskApache that has a sample .htaccess snippet complete with bad bots already coded in
- You can add any bots to the sample .htaccess file as long as you follow the .htaccess syntax rules
- Test your .htaccess file with a bot spoofing site like wannabrowser.com
Check Your Server Logs for Bad Bots
If you read your website server logs, you'll see that bots and crawlers regularly visit your site–these visits can ultimately amount to hundreds of hits a day and plenty of bandwidth. The server log pasted above is from TastyPlacement, and the bot identified in red is discoverybot. This bot was nice enough to identify its website for me: DiscoveryEngine.com touts itself as the next great search engine, but presently offers nothing except stolen bandwidth. It's not a bot I want visiting my site. If you check your server logs, you might see bad bots like sitesnagger, reaper, harvest, and others. Make a note of any suspicious bots you see in your logs.
AskApache’s Bad Bot RewriteRules
AskApache maintains a very brief tutorial but a very comprehensive .htaccess code snippet here. What makes that page so great is that the .htaccess snippet already has dozens of bad bots blocked (like reaper, blackwidow, and sitesnagger), and you can simply add any new bots you identify.
If we want to block a bot not covered by AskApache's default text, we just add it to the "RewriteCond" section, separating each bot with a "|" pipe character. We've put "discoverybot" in our file because that's a visitor we know we don't want:
```
# IF THE UA STARTS WITH THESE
RewriteCond %{HTTP_USER_AGENT} ^(verybadbot|discoverybot) [NC,OR]
```
If you are on the WordPress platform, be careful not to disrupt existing entries in your .htaccess file. As always, keep a backup of your .htaccess file; it's quite easy to break your site with one coding error. Also, it's probably better to put these rewrite rules near the beginning of your .htaccess file so they are processed before any other directives. Here's a simplified version of the complete .htaccess file:
```
ErrorDocument 403 /403.html
RewriteEngine On
RewriteBase /
# IF THE UA STARTS WITH THESE
RewriteCond %{HTTP_USER_AGENT} ^(black.?hole|blackwidow|discoverybot) [NC,OR]
# ISSUE 403 / SERVE ERRORDOCUMENT
RewriteRule . - [F,L]
```
Here’s a translation of the .htaccess file above:
- ErrorDocument sets a webpage called 403.html to serve as our error document when bad bots are encountered; create this page in your root directory (the content doesn't matter; ours is a text file with just the characters "403")
- RewriteEngine and RewriteBase simply mean "ready to enforce rewrite rules, and set the base URL to the website root"
- RewriteCond directs the server “if you encounter any of these bot names, enforce the RewriteRule that follows”
- RewriteRule directs all bad bots identified in the text to our ErrorDocument, 403.html
Testing Our .htaccess File
Once you upload your .htaccess file, you can test it by browsing to your site and pretending to be a bad bot. You do this by going to wannabrowser.com and spoofing a User Agent; in this case, we spoofed "SiteSnagger":
If everything is installed properly, you should be directed to your 403 page, and you will have successfully blocked most bad bots.
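If you'd rather test from your own machine than through a third-party site, a quick script along these lines can do the same check. This is just a minimal sketch using Python's requests library; the URL is a placeholder for your own site, and the user agents are examples:

```python
import requests

# Placeholder URL -- replace with a page on your own site.
URL = "https://www.example.com/"

# One user agent from the blocked list, and a generic browser string for comparison.
user_agents = {
    "SiteSnagger (should be blocked)": "SiteSnagger",
    "Normal browser (should load)": "Mozilla/5.0 (compatible; test-browser)",
}

for label, ua in user_agents.items():
    resp = requests.get(URL, headers={"User-Agent": ua})
    # A working rule set returns 403 for the bad bot and 200 for the browser.
    print(f"{label}: HTTP {resp.status_code}")
```

If the rules are working, the spoofed bot request comes back with a 403 (your 403.html page), while the browser request loads normally.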
Some Limitations
Now, why don’t we do this with Robots.txt and simply tell bots not to index? Simple: because bots might simply ignore our directive, or they’ll crawl anyway and just not index the content–that’s not a fix. Even with this .htaccess fix, it’ll only block bots that identify themselves. If a bot is spoofing itself as a legitimate User Agent, then this technique won’t work. We’ll post a tutorial soon about how to block traffic based on IP address. But, that said, you’ll block 90% of bad bot traffic with this technique.
Enjoy!
Google Announces New Link Disavowal Tool
/3 Comments/in SEO/by Michael DavidGoogle’s Matt Cutts announced at the PubCon marketing conference that Google is rolling out a new much-anticipated link disavowal tool. Bad links from poor quality sites can harm a site’s rankings in Google, and Google has implemented this tool to let webmasters remove bad links from their link profile. TastyPlacement has been able to use this tool for clients at risk of being associated with spammy or shady sites. This feature can be extremely helpful in our WordPress SEO Service when we are identifying specific client issues.
The tool will operate by uploading a .txt file that contains a list of URLs a webmaster wishes to disavow. There will also be a domain: operator that will let a webmaster disavow all links from an entire domain.
Here’s a sample of what a disavowal tool text file would look like:
```
http://www.shadysites.com/bad-post
http://www.shadysites.com/another-bad-post
domain:reallyshadysite.com
```
The tool is ostensibly live at the following URL:
https://www.google.com/webmasters/tools/disavow-links-main
Cutts warns that it’s always better to have bad links removed rather than disavow links with the tool, but recognizes that’s not always possible.
Freddie Mercury’s Guide to SEO
by Claire J. Dunn

Want to be an awesome SEO or digital marketer? Seek out Those of Epic Awesomeness and learn from them. If you think like a legend in your work life, your work will be legendary. And, after what must be my 5000th listening of A Night at the Opera, I started thinking about what lessons Freddie Mercury can teach internet marketers.
Surround Yourself With Great People
Freddie Mercury had great co-workers. Guitarist Brian May played for a decade on a guitar he built with his father in the family tool shed. The guitar had an advanced tremolo system that wasn’t available at the time. Brian’s guitar fueled millions in record sales for Queen. Today, Brian owns his own guitar company that produces a commercial replica of his homemade guitar.
Queen’s bassist John Deacon began tinkering with piano in the mid-70s. Almost immediately after beginning to learn the new instrument, he crafted the now-iconic song “You’re My Best Friend.” John has said, casually, “basically that’s the song that came out you know when I was learning to play piano.” The song was a major hit, and helped push Queen’s “A Night at the Opera” album to triple-platinum sales.
Great people produce great things. You can’t do everything alone, so partner up or populate a staff with people of great talent and your work product will be great.
Appreciate Your Clients, They’ll Bring You Fame and Fortune
You are nothing without your clients (or your readers and buyers as the case may be). Freddie honored and loved his fans. “You brought me fame and fortune and everything that goes with it; I thank you all,” he insists in “We Are the Champions.” Your clients are gold. Treat them that way and you’ll enjoy riches and be a champion for years to come.
Build on the Past, but Be Original
Queen regularly employed elements of the past in their music. They drew heavily upon the growing heavy metal movement (at the time pioneered by Led Zeppelin and Black Sabbath) but also included less-expected elements like classical and opera. “It’s unheard of to combine opera with a rock theme, my dear,” Freddie once remarked. He also once said that “the whole point of Queen was to be original.” Queen knew where to ground themselves and where to branch out and be original. The resulting effect was a supernatural but radio-ready rock sound.
You can do the same. Web marketing draws on a foundation of concepts that originate with traditional advertising. Web marketers must honor well-settled advertising concepts like calls to action, customer conversion, branding, etc. After all, the psychology of the buyer hasn’t changed much since the beginning of time. But web marketing presents infinite opportunities to be original. When you blend a solid marketing foundation with creative and innovative ideas, you will excel, and you will succeed. Don’t simply imitate the successes of others, build on the successes of others with your own spin.
Be Fabulous and Think Big
“I always knew I was a star, and now, the rest of the world seems to agree with me,” Freddie once mused. No one ever accused Queen or Freddie of thinking small — even before they were famous. In marketing as in rock music, it's the big ideas that get the most attention. If you are building a website for a client or for one of your own properties, make it the best-in-class for that space. Queen wasn't done with a song until they had lavished it with operatic 4-part harmonies. Aim high and people will love your product.
If you have an idea for a blog post article, why not go for broke and develop it into a full-scale infographic? We re-learned this lesson recently when our infographic study of social media impact on search signals went viral, earning us thousands of social media mentions and hundreds of links from the SEO and entrepreneurial community. Had we issued that material as a blog post, it might have been lost in the shuffle of thousands of other blog posts. We took the route a champion would take and it paid off.
When web content goes viral, it’s the same as when a band gets famous: when people love something, they tell their friends.
Pictures From Pubcon Paradise 2012
by Michael David

I thought this was appropriate as an introduction, from the opening evening networking event sponsored by TastyPlacement:
Day One of Regular Pubcon Sessions:
Evening Networking Party Sponsored by the Social Media Club of Hawaii:
Day Two of Pubcon Regular Sessions:
Wrap-up Party at Jimmy Buffet’s
Siri Search Optimization
by Michael David

You may have heard that the iPhone's new voice-command and personal/search assistant “Siri” will be “the end of SEO as we know it.” Undoubtedly a shift is coming, but I for one doubt it will be as disruptive as the doomsayers might have you believe. After all, we're not all going to use only our phones for everything. We like our laptops, and in addition, bargain hunting (AKA commercial search) is deeply ingrained in human nature.
There are a lot of fun things Siri can do, including transcribing voice to text, setting reminders, playing music, checking the weather, getting directions, and, yes, carrying out search queries. Undoubtedly, Siri will catch on like wildfire, and as a result will compete with many apps and tools, including search engines.
Optimizing for Siri
The integration of Siri will begin to affect strategies and optimization efforts, but most of these tactics should be part of a comprehensive SEO program from the start.
Local Search for Siri
People search from mobile devices on the move; they’re not sitting down to do in-depth research. A majority of mobile searches are location-specific including directions, finding nearby restaurants, or other local services.
With Siri, it’s not about people getting to your website through Google placement alone because visibility comes from other sources. Siri wants to give users a visual experience and draws data from local listing sites such as Yelp, Google Maps, Citysearch, YP, etc. There are more than 60 of these sites on which it is well worth your time to create a listing. It’s not just for Siri, getting listed on (and links from) all these sites improves local listing and organic placements in SERPs as well.
Obviously, you’ll want your information to be correct, up to date, and fully filled out on these sites with accurate address, phone number, images, positive reviews, and a high number of ratings. For more info on local optimization, check out our post on local listings SEO.
Rich Snippets and Schema Tags
Schema.org lets you use a dedicated markup vocabulary (web code) to identify specific information about your business and web presence and make that information more easily found by search engines.
Search engines are using on-page tags in a variety of ways. Google uses them to create rich snippets in search results and will continue to do so more and more. These snippets include author information, address, phone number, operating hours, and so on. So you can see how these tags have value for the kind of local searches Siri handles. Offering this information in a highly structured format makes it that much easier to be found.
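As a rough illustration, here is the kind of structured business data schema.org describes. The business details below are hypothetical, and the snippet simply uses Python to print the fields as structured data; whichever schema.org syntax you actually embed in your pages, the goal is to expose these same properties in machine-readable form:

```python
import json

# Hypothetical business details -- the property names come from
# schema.org's LocalBusiness and PostalAddress types.
business = {
    "@type": "LocalBusiness",
    "name": "Example Dental Practice",
    "telephone": "(512) 555-0100",
    "openingHours": "Mo-Fr 08:00-17:00",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Example St",
        "addressLocality": "Austin",
        "addressRegion": "TX",
        "postalCode": "78704",
    },
}

# Print the structured data; marked up on your pages, these fields give
# search engines your name, address, phone, and hours unambiguously.
print(json.dumps(business, indent=2))
```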
Variety in Linkbuilding and Long Tail Keywords
This is the first version of Siri, and its depth of language capabilities will continue to increase with new versions, so the following effect will only grow. Already, Siri queries are longer because users are searching in natural speech rather than pecking away at keyboards or small iPhone touchscreens.
The result is more long-tail and highly targeted searches. Optimizing for long-tail means more words on the page and more flexible link building. Both of these strategies work in organic search as well, so you won’t even have to duplicate your efforts.
It used to be that you chose your anchor text and could simply bang away at it over and over. With enough links, you’d move on up. That hasn’t been best practice for a while, and Google is becoming even more focused on natural-looking anchor text profiles. Not only is this a safety-first method, but it’s also more efficient. Flexible anchor text (anchor text with the keyword integrated here and there, but also broadly varied) is more efficient in increasing rankings, even for the targeted, high-volume terms.
Back to Siri, the efforts you make to naturalize and get the most out of your link profile will also help you rank for long-tail searches, which Siri is all about. As a bonus, long tail searches are more targeted to the specific needs of a given search query and therefore convert at higher rates.
The iPhone 4s (S is for Siri? Seems that way to me…) is Apple's best-selling phone to date, with 4 million sales in three days. Verizon started carrying the iPhone earlier this year, and even Sprint has had no choice but to jump on the bandwagon. It's a monolith, and it's the impetus for a new phase of search optimization.
Highlights From PubCon Vegas 2011
/0 Comments/in SEO/by Michael DavidI’ve just returned from PubCon Las Vegas 2011 where I spoke on Hosting Issues and SEO, and Ways to Monetize a Blog. Bruce Clay’s staff did a great job of summarizing the Monetizing Your Blog segment, complete with some screenshots. It was a great conference with lots of national leaders in the disciplines of SEO, social media, and internet marketing.
Leo Laporte’s Keynote Address:
Marketing in the Social Era and the Future of Search
Leo Laporte is an Emmy Award-winning veteran of technology broadcasting, and a great thinker with respect to the internet and marketing. He had some noteworthy messages.
Leo offered some insights into how advertising and marketing have evolved to the present day. If we look back to, say, 1890 and examine a Sears catalog, we'll see basic descriptions of "features and benefits"–no marketing fluff there. But as the 20th Century progressed, marketers injected skill and technique to bend a product's message to appeal to buyers on an emotional level, or to force brand identities upon consumers. An effective technique to be sure, but not necessarily in the interests of consumers. A related idea: "brands are the refuge of the ignorant." In other words, a brand is what a consumer refers to when they have no true benchmark for the underlying quality or suitability of a product or service.

More recently, in just the past few years, consumers have come to depend on online reviews, ratings, and recommendations from their social circles to make buying decisions. This is a fundamental shift in purchasing motivation. As Leo notes, it's as if the circle has closed and "features and benefits" once again become the linchpin of purchasing decisions. He sees social media and websites with engaged users as the great drivers of purchasing decisions in the present and near future.
Leo also offered some predictions about the future of search engines and Google specifically. He does not feel that Google will be as relevant in the future and went as far as to say that Google will face some serious challenges. The example he gave was Apple's new Siri app. Siri lets users speak a command like "find me a dentist near 78704". Siri then completes the search and offers the user an answer to that query. Note that something very fundamental just changed: the interface (Siri) now controls how the query is executed, rather than the user (as is the case with a simple search at Google.com). So, even if Apple chooses to direct Siri queries to Google today, it is Apple, not the user, that controls where the query goes. Leo noted that the internet was a "disintermediary"–it killed travel agents because users could simply make their reservations with the airline directly. Services like Siri are "re-intermediaries"–they insert themselves between the user and the search engine. So, theoretically, if a manufacturer like Apple can control the user interface (as is the case with Siri), Apple can control the search, thereby threatening Google.
Google’s Matt Cutts and Amit Singhal Talk About Upcoming Initiatives at Google
Matt Cutts, the head of Google’s spam team (he also authors and appears in Google’s Webmaster Central Channel Videos on YouTube), and Google’s Amit Singhal spoke at the commencement of the 2nd day of the session. Conferences like this are a great way to learn what Google thinks is important and how they value sites and decide rankings.

I ran into Matt Cutts at the Wynn Casino. He was gracious enough to spend a few minutes speaking with me and my wife.
Besides some expected disagreement with Leo Laporte's earlier warning that Google was in big trouble, one particular highlight caught my attention:
Google Testing an “Above the Fold” Algorithm
Matt Cutts stated that Google was testing an "above the fold" algorithm change that "so far…looks pretty good." The term "above the fold" refers, simply, to the top of a webpage. The term is inherited from the newspaper industry, the fold being the upward-facing part of the newspaper as it lies folded. This algorithm change would look for quality content at the top of a webpage. So, if a particular website were stuffed with ads above the fold, that website might not perform as well in search following the change. No word on if or when this will be implemented.
Some additional insights into where search is headed: mobile search will continue to grow in importance, and Google will keep working to build quality in that area. Social sharing and activity will also continue to grow in importance. Matt Cutts also proposed a way to protect the original creators of content from having that content stolen by scraper sites. He said that Google may soon begin offering a notification system that works as follows: when new content is created by a website owner, the creator can ping Google with an alert that confirms "I am the creator of this content and all other copies are not to be indexed and appear in search." If implemented, this would be a valuable tool for website owners. Matt also spoke about author reputation in search, noting that author reputation and authority can serve as a great measure of the value of content created by those authors.
General Ideas Presented at PubCon
From the broad pool of speakers, some general ideas emerged that present great opportunities for ranking, placement, and visibility going forward.
Author Profiles in Search Results
This idea echoes what Matt Cutts said about author reputation. In the screenshot below, you'll note some search results with a thumbnail photo of Matt Cutts and the text "by Matt Cutts – In 135,595 Google+ circles". This is a recent search feature that links an author's content to his or her Google+ profile.
This feature is easy to implement by tagging content with a link to each author's Google+ profile; a sketch of what that tag can look like follows below. We'll be rolling this out for our clients in upcoming weeks, and of course, we'll be working with clients to help build out properly optimized Google+ profiles. It is also important to note that this connection between content and author profile adds authority to the content and can potentially increase its ranking positions. In a WordPress environment, this feature will apply to posts, but not to commercial pages, contact pages, etc.
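For illustration only, here is a minimal sketch of the kind of byline tag involved. The author name and profile URL are hypothetical placeholders; the important part is the rel="author" attribute pointing at the author's Google+ profile (with the profile, in turn, listing your site under "Contributor to"):

```python
# Minimal sketch: build a byline link that ties a post to its author's
# Google+ profile. The name and profile URL below are made-up placeholders.
AUTHOR_NAME = "Jane Example"
GOOGLE_PLUS_PROFILE = "https://plus.google.com/112233445566778899000"

# rel="author" is what associates the page's content with the profile.
byline = f'<a href="{GOOGLE_PLUS_PROFILE}" rel="author">{AUTHOR_NAME}</a>'

# Drop this byline into the post template so every article carries the tag.
print(byline)
```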
The Power of Social Sharing
Another prominent theme at PubCon (the second PubCon in a row, actually) is the power of social sharing to increase ranking and visibility. Besides the obvious effect of gaining additional placement through the sharing of content (after all, if content is shared, that means other people will see it), sharing of content can serve as a signal to search engines that “this content is valuable” and “this site is a legitimate source of content.” Remember, Google’s mission is to filter out thin content and deliver valuable content in response to search queries. Social sharing continues to rise in importance as a ranking factor within Google and other search engines.
Site Value Over Page Value
Another topic discussed at PubCon was the continued shift in how Google values individual pages of content. Typically, in the past, Google would tend to rank an individual page based on the merits of that page: the page elements, keywords used, load speed, inbound links. More recently, Google is shifting its focus to the value of the site as a whole. And so, a loose collection of well-optimized pages will not perform as well as a website that has developed overall authority. Ways to achieve this? Start by removing weak content from your site–weak content can actually harm your valuable content by lowering the authority of the site as a whole. Also, social sharing, discussed above, can increase the authority and power of a site.