With Google Analytics 4 (GA4), Google’s next generation of Analytics, replacing Universal Analytics (UA) in July, it’s crucial to understand how GA4 processes data and how long that processing takes.

Why is tracking event-based data in GA4 important for search engine optimization?

With GA4, we can analyze events and conversions in reports. We can also see which pages on the website receive the most traffic and how many events and conversions are tracked on each page. With this data, we can report back to clients on how landing pages are performing, which internal and external links users are clicking, how many form submissions are being completed, and so on. The number of events and conversions can help determine which SEO efforts are needed to improve rankings overall.

Analyzing how GA4 tracks new events

To test GA4 event tracking, we created a new page on a sample website and added an outbound link to that page. We confirmed that the website had a GA4 property tagged so the data would be tracked accurately. To track this link-click event in Analytics, we created a new GA4 Event Tag in Google Tag Manager with a trigger that fires when that specific link is clicked. Once the event is created, it takes around 24 hours for it to appear in GA4.
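For context on how events reach GA4 at all: browser tags are one path, but GA4 also accepts events server-side through its Measurement Protocol. Below is a minimal Python sketch of sending a comparable link-click event directly to GA4; the measurement ID, API secret, and the event name outbound_link_click are placeholders rather than values from our test setup.

import requests  # assumes the requests library is installed

# Placeholder credentials: found under Admin > Data Streams in GA4
MEASUREMENT_ID = "G-XXXXXXXXXX"
API_SECRET = "your-api-secret"

# GA4 Measurement Protocol collection endpoint
endpoint = (
    "https://www.google-analytics.com/mp/collect"
    f"?measurement_id={MEASUREMENT_ID}&api_secret={API_SECRET}"
)

payload = {
    "client_id": "555.1234567890",  # any stable, pseudonymous client ID
    "events": [
        {
            "name": "outbound_link_click",  # hypothetical event name
            "params": {"link_url": "https://www.example.com/outside-link"},
        }
    ],
}

# A 2xx status means GA4 accepted the hit; processing still takes time
response = requests.post(endpoint, json=payload, timeout=10)
print(response.status_code)

Events sent this way still pass through GA4’s normal processing pipeline, so expect a similar delay before they show up in reports.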

 

Google Tag Manager: GA4 Event Tag

 

Event Names/Counts in GA4

 

How long does it take for event data to track in GA4?

Using this new event, we tested how long the data takes to appear in GA4. Over the course of a few weeks, we clicked the link a few times and monitored GA4 to see how soon the new data appeared. After several trials, the new event data consistently appeared in GA4 around 16 hours after we clicked the link twice.

Data from 3/22/2023 4:23 p.m.

 

Data from 3/23/2023 9:15 a.m.

 

Why should we monitor data processing time?

Up-to-date data, also known as data freshness, is important for identifying trends, patterns, and user behaviors as they occur. This matters most when the data feeds reporting, for example in Google Looker Studio. To see how data updates in Looker Studio, we created a new report with a chart of the event data and watched how long fresh GA4 data took to appear in the report. In our testing, the data appeared in the report exactly as it did in GA4, as soon as GA4 itself was up to date. This may vary depending on the data freshness settings in Looker Studio, but it gives you an idea of when to expect this data in your report.

Keeping data freshness in mind, picking the right time frame is important for getting accurate numbers. Since our test showed that new event data isn’t processed until roughly 16 hours after it occurs, data from the previous day may still be incomplete. For a reliable view of your website analytics, the day before yesterday is the most recent date worth checking; it’s the freshest fully processed data on your GA4 property.
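If you pull GA4 data programmatically, you can bake that buffer into your queries. Here is a minimal sketch using Google’s google-analytics-data Python client, ending the date range at "2daysAgo" so the report covers only fully processed days; the property ID is a placeholder, and the sketch assumes Application Default Credentials are already configured.

# pip install google-analytics-data
from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import (
    DateRange,
    Dimension,
    Metric,
    RunReportRequest,
)

PROPERTY_ID = "123456789"  # placeholder GA4 property ID

client = BetaAnalyticsDataClient()

request = RunReportRequest(
    property=f"properties/{PROPERTY_ID}",
    dimensions=[Dimension(name="eventName")],
    metrics=[Metric(name="eventCount")],
    # Stop at 2daysAgo: yesterday's data may still be processing
    date_ranges=[DateRange(start_date="28daysAgo", end_date="2daysAgo")],
)

response = client.run_report(request)
for row in response.rows:
    print(row.dimension_values[0].value, row.metric_values[0].value)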

Looker Studio Report Data

 

Conclusion

Whether you’re reporting data or using data to improve SEO, it is important to ensure that the data is accurate and up to date. Based on our tests, once you’ve created a GA4 property or successfully migrated to GA4, you can expect to see fresh data in less than a day. Migrating from UA to GA4 can be an adjustment when it comes to data collection, so learning how and when data is collected in GA4 makes a real difference when putting that data to work for your campaign.


Having duplicate Yelp listings for your business may make it difficult for potential clients to find you and is not ideal for your Local Campaign.

If you are a business owner, but not necessarily Yelp-savvy, you might not be aware that you created duplicate listings for your business. This can happen while you’re trying to create a listing for a different location, update your business info, or change your business name – which was the case for our client.

If you have a similar issue, this tutorial can help you, or at the very least, point you in the right direction.

When merging duplicate Yelp listings, make sure you are logged in as a business owner because it will make things much easier. In the back-end, there will be a number provided: (877) 767-9357. This is the number for the Yelp Advertising department.

When I called, I was helped with my request by a Senior Account Executive. Luckily for me, the executive emailed some info regarding my request, so I now have his email address as well as his direct line for future inquiries. However, if you are not emailed by someone who works at Yelp, you should still ask for their contact info for your reference. Now, back to merging listings.

In detail, let whoever is helping you know exactly what you’re trying to achieve. In this case, I was trying to merge two duplicate listings into one and update the business name.

In this instance, since we were working with a doctor’s personal listing as well as his practice, he had to give written consent to Yelp. The executive at Yelp emailed me the following statement to forward to the doctor to sign:

I, __________ , give Yelp my consent to merge my individual Yelp listing with my practice listing. I understand that this will not be undone in the future, and I will lose out on the ability to take any photos, reviews, tips, videos, or other content with me in the future should I move locations or practices.

It took some time, but I got this back from the doctor and emailed it immediately to the Yelp executive. I did not hear back from him for a couple of weeks, so I called his direct line (again, I strongly urge you to record someone’s contact info at Yelp).

The Yelp executive said he remembered our first conversation and would complete my request right away. I checked the listings about an hour later, and…

TA-DAH! The listings had been merged.

Merging duplicate Yelp listings can actually be quite simple. The whole process should take about a week or so if you are checking on it regularly. It will take effective communication and possibly some phone calls, but you can do it too!

Google Says it’s Forbidden and it’s Bad for Clients

(So Why are Agencies Still Doing It?)

While working on our own WordPress SEO, we’ve learned that site-wide footer links have always been dicey at best (you know, “Designed by Denver SEO Company” in the footer of a website). They were easy to detect both visually and algorithmically, and like any site-wide link, they generate potentially thousands of inbound links from one IP/Website. In-content links (one or two from a website’s content rather than hundreds in the footer or sidebar) were always more desirable. TastyPlacement.com has many examples of in-content links throughout the site.

Examples of Obvious Footer Links Are Easy to Find

Here’s an example from an Inc. 5000 company that touts its work for the NFL and Canon, with a footer link on a client website bearing the simple-minded anchor text “Website Design”:

[Screenshot: client-site footer link with “Website Design” anchor text]

The previous example is from an NYC design agency that ranks number 1 for “New York Web Design”.

Does Google Hate Footer Links?

Well, maybe. Google certainly has warned against it. In an October 2012 revision to its webmaster guidelines, Google warned about “[w]idely distributed links in the footers of various sites.” A valuable discussion on Webmaster World regarding footer links followed. Certainly, the use of footer links, especially when used with aggressive anchor text, should be undertaken with caution. Just as certain though is that footer links can still generate strong rankings.

Footer Links and Client Responsibility

There’s another facet to this question, though: taking footer links on your clients’ websites. If you are a website designer or an SEO, when you take a footer link on a client website, you are doing a few things:

  • You are using your superior knowledge of the search algorithms to get a link from someone who trusts you; they might not give the link so willingly if they knew all the facts and consequences.
  • You are exposing your own web property to an inbound link that violates Google’s written webmaster guidelines.
  • You are exposing your client’s website to a potential Google violation.
  • You are taking Page Rank from a client and giving it to yourself.
  • You have a more effective and safer alternative, an “About This Site” page or its equivalent–still sorta sleazy, but maybe not so obvious.

If you want the possible referral business that a prestige web build might generate, you can always achieve that with a simple text citation, with no link.

 

In this tutorial, we’ll learn how to block bad bots and spiders from your website. This is a standard safety measure we implement with our WordPress SEO service. We can save bandwidth and performance for customers, increase security, and prevent scrapers from putting duplicate content around the web.

Quick Start Instructions/Roadmap

For those looking to get started right away (without a lot of chit-chat), here are the steps to blocking bad bots with .htaccess:

  • FTP to your website and find your .htaccess file in your root directory
  • Create a page in your root directory called 403.html; the content of the page doesn’t matter (ours is a text file with just the characters “403”)
  • Browse to this page on AskApache that has a sample .htaccess snippet complete with bad bots already coded in
  • You can add any bots to the sample .htaccess file as long as you follow the .htaccess syntax rules
  • Test your .htaccess file with a bot spoofing site like wannabrowser.com

Check Your Server Logs for Bad Bots

Bad Bots Server Log

If you read your website server logs, you’ll see that bots and crawlers regularly visit your site–these visits can ultimately amount to hundreds of hits a day and plenty of bandwidth. The server log pasted above is from TastyPlacement, and the bot identified in red is discoverybot. This bot was nice enough to identify its website for me: DiscoveryEngine.com touts itself as the next great search engine, but presently offers nothing except stolen bandwidth. It’s not a bot I want visiting my site. If you check your server logs, you might see bad bots like sitesnagger, reaper, harvest, and others. Make a note of any suspicious bots you see in your logs.
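If you would rather not eyeball the raw log, a short script can do the counting for you. Here is a rough Python sketch that tallies requests whose User-Agent matches a watch list; it assumes the standard Apache combined log format (where the User-Agent is the final quoted field), and the log path and bot list are placeholders you should adjust for your own server.

import re
from collections import Counter

LOG_FILE = "/var/log/apache2/access.log"  # placeholder path
SUSPECT_BOTS = ["discoverybot", "sitesnagger", "reaper", "harvest"]

# In the combined log format, the User-Agent is the last quoted field on each line
ua_pattern = re.compile(r'"([^"]*)"\s*$')

hits = Counter()
with open(LOG_FILE, encoding="utf-8", errors="replace") as log:
    for line in log:
        match = ua_pattern.search(line)
        if not match:
            continue
        user_agent = match.group(1).lower()
        for bot in SUSPECT_BOTS:
            if bot in user_agent:
                hits[bot] += 1

for bot, count in hits.most_common():
    print(f"{bot}: {count} requests")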

AskApache’s Bad Bot RewriteRules

AskApache maintains a very brief tutorial but a very comprehensive .htaccess code snippet here. What makes that page so great is that the .htaccess snippet already has dozens of bad bots blocked (like reaper, blackwidow, and sitesnagger), and you can simply add any new bots you identify.

If we want to block a bot not covered by AskApache’s default text, we just add it to the “RewriteCond” section, separating each bot with a “|” pipe character. We’ve put “discoverybot” in our file because that’s a visitor we know we don’t want:

# IF THE UA STARTS WITH THESE
RewriteCond %{HTTP_USER_AGENT} ^(verybadbot|discoverybot) [NC,OR]

If you are on the WordPress platform, be careful not to disrupt existing entries in your .htaccess file. As always, keep a backup of your .htaccess file; it’s quite easy to break your site with one coding error. Also, it’s probably better to put these rewrite rules at the beginning of your .htaccess file so no pages are served before the bots read the rewrite directives. Here’s a simplified version of the complete .htaccess file:

ErrorDocument 403 /403.html

RewriteEngine On
RewriteBase /

# IF THE UA STARTS WITH THESE
RewriteCond %{HTTP_USER_AGENT} ^(black.?hole|blackwidow|discoverybot) [NC]

# ISSUE 403 / SERVE ERRORDOCUMENT
RewriteRule . - [F,L]

Here’s a translation of the .htaccess file above:

  • ErrorDocument sets a webpage called 403.html to serve as our error document when bad bots are encountered; create this page in your root directory. The content of the page doesn’t matter (ours is a text file with just the characters “403”)
  • RewriteEngine and RewriteBase simply mean “ready to enforce rewrite rules, with the base URL set to the website root”
  • RewriteCond directs the server “if you encounter any of these bot names, enforce the RewriteRule that follows”
  • RewriteRule directs all bad bots identified in the text to our ErrorDocument, 403.html

Testing Our .htaccess File

Once you upload your .htaccess file, you can test it by browsing to your site while pretending to be a bad bot. You do this by going to wannabrowser.com and spoofing a User Agent; in this case, we spoofed “SiteSnagger”:

If everything is installed properly, you should be directed to your 403 page, which means you have successfully blocked most bad bots.
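If you prefer testing from a script rather than a spoofing site, a few lines of Python do the same job. This sketch (assuming the requests library and a placeholder URL) sends one request with an ordinary browser User-Agent and one pretending to be SiteSnagger; with the rules in place, the first should come back 200 and the second 403.

import requests  # assumes the requests library is installed

URL = "https://www.example.com/"  # placeholder: use your own site

tests = {
    "normal browser": "Mozilla/5.0 (compatible; normal-visitor-test)",
    "bad bot": "SiteSnagger",
}

for label, user_agent in tests.items():
    # send the same request twice, varying only the User-Agent header
    response = requests.get(URL, headers={"User-Agent": user_agent}, timeout=10)
    print(f"{label}: HTTP {response.status_code}")

# Expected with the blocking rules active:
#   normal browser: HTTP 200
#   bad bot: HTTP 403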

Some Limitations

Now, why don’t we do this with robots.txt and simply tell bots not to index? Simple: because bots might ignore our directive, or they’ll crawl anyway and just not index the content–that’s not a fix. Even with this .htaccess fix, it will only block bots that identify themselves; if a bot spoofs a legitimate User Agent, this technique won’t work. We’ll post a tutorial soon about how to block traffic based on IP address. That said, you’ll block 90% of bad bot traffic with this technique.

Enjoy!

Google’s Matt Cutts announced at the PubCon marketing conference that Google is rolling out a new, much-anticipated link disavowal tool. Bad links from poor-quality sites can harm a site’s rankings in Google, and Google has implemented this tool to let webmasters remove bad links from their link profile. TastyPlacement has been able to use this tool for clients at risk of being associated with spammy or shady sites. This feature can be extremely helpful in our WordPress SEO Service when we are identifying specific client issues.

The tool operates by uploading a .txt file that contains a list of URLs a webmaster wishes to disavow. There is also a domain: operator that lets a webmaster disavow all links from an entire domain.

Here’s a sample of what a disavowal tool text file would look like:

http://www.shadysites.com/bad-post
http://www.shadysites.com/another-bad-post
domain:reallyshadysite.com

The tool is ostensibly live at the following URL:

https://www.google.com/webmasters/tools/disavow-links-main

Cutts warns that it’s always better to have bad links removed rather than disavow links with the tool, but recognizes that’s not always possible.

Want to be an awesome SEO or digital marketer? Seek out Those of Epic Awesomeness and learn from them. If you think like a legend in your work life, your work will be legendary. And, after what must be my 5000th listening of A Night at the Opera, I started thinking about what lessons Freddie Mercury can teach internet marketers.

Surround Yourself With Great People

Freddie Mercury had great co-workers. Guitarist Brian May played for a decade on a guitar he built with his father in the family tool shed. The guitar had an advanced tremolo system that wasn’t available at the time. Brian’s guitar fueled millions in record sales for Queen. Today, Brian owns his own guitar company that produces a commercial replica of his homemade guitar.

Queen’s bassist John Deacon began tinkering with piano in the mid-70s. Almost immediately after beginning to learn the new instrument, he crafted the now-iconic song “You’re My Best Friend.” John has said, casually, “basically that’s the song that came out you know when I was learning to play piano.” The song was a major hit, and helped push Queen’s “A Night at the Opera” album to triple-platinum sales.

Great people produce great things. You can’t do everything alone, so partner up or populate a staff with people of great talent and your work product will be great.

Appreciate Your Clients, They’ll Bring You Fame and Fortune

You are nothing without your clients (or your readers and buyers as the case may be). Freddie honored and loved his fans. “You brought me fame and fortune and everything that goes with it; I thank you all,” he insists in “We Are the Champions.” Your clients are gold. Treat them that way and you’ll enjoy riches and be a champion for years to come.

Build on the Past, but Be Original

Queen regularly employed elements of the past in their music. They drew heavily upon the growing heavy metal movement (at the time pioneered by Led Zeppelin and Black Sabbath) but also included less-expected elements like classical and opera. “It’s unheard of to combine opera with a rock theme, my dear,” Freddie once remarked. He also once said that “the whole point of Queen was to be original.” Queen knew where to ground themselves and where to branch out and be original. The resulting effect was a supernatural but radio-ready rock sound.

You can do the same. Web marketing draws on a foundation of concepts that originate with traditional advertising. Web marketers must honor well-settled advertising concepts like calls to action, customer conversion, branding, etc. After all, the psychology of the buyer hasn’t changed much since the beginning of time. But web marketing presents infinite opportunities to be original. When you blend a solid marketing foundation with creative and innovative ideas, you will excel, and you will succeed. Don’t simply imitate the successes of others; build on them with your own spin.

Be Fabulous and Think Big

“I always knew I was a star, and now, the rest of the world seems to agree with me,” Freddie once mused. No one ever accused Queen or Freddie of thinking small — even before they were famous. In marketing as in rock music, it’s the big ideas that get the most attention. If you are building a website for a client or for one of your own properties, make it the best-in-class for that space. Queen wasn’t done with a song until they had lavished it with operatic 4-part harmonies. Aim high and people will love your product.

If you have an idea for a blog post article, why not go for broke and develop it into a full-scale infographic? We re-learned this lesson recently when our infographic study of social media impact on search signals went viral, earning us thousands of social media mentions and hundreds of links from the SEO and entrepreneurial community. Had we issued that material as a blog post, it might have been lost in the shuffle of thousands of other blog posts. We took the route a champion would take and it paid off.

When web content goes viral, it’s the same as when a band gets famous: when people love something, they tell their friends.

I thought this was appropriate as an introduction, from the opening evening networking event sponsored by TastyPlacement:

[Photo: Pubcon Paradise 2012]

Day One of Regular Pubcon Sessions:

[Photo gallery: Pubcon Paradise 2012]

Evening Networking Party Sponsored by the Social Media Club of Hawaii:

[Photo gallery: Pubcon Paradise 2012]

Day Two of Pubcon Regular Sessions:

[Photo gallery: Pubcon Paradise 2012]

Wrap-up Party at Jimmy Buffett’s

[Photo gallery: Pubcon Paradise 2012]

You may have heard that the iPhone’s new voice-command and personal/search assistant “Siri” will be “the end of SEO as we know it.” Undoubtedly a shift is coming, but I for one doubt it will be as disruptive as the doomsayers might have you believe. After all, we’re not all going to use only our phones for everything. We like our laptops, and besides, bargain hunting (AKA commercial search) is deeply ingrained in human nature.

There are a lot of fun things Siri can do, including transcribing voice to text, setting reminders, playing music, checking the weather, getting directions, and, yes, carrying out search queries. Undoubtedly, Siri will catch on like wildfire and, as a result, will compete with many apps and tools, including search engines.

Optimizing for Siri

The integration of Siri will begin to affect strategies and optimization efforts, but most of these things should be part of a comprehensive SEO program from the start.

Local Search for Siri

People search from mobile devices on the move; they’re not sitting down to do in-depth research. A majority of mobile searches are location-specific: directions, nearby restaurants, and other local services.

With Siri, visibility doesn’t come from Google placement alone; it comes from other sources as well. Siri aims to give users a visual experience and draws data from local listing sites such as Yelp, Google Maps, Citysearch, YP, and others. There are more than 60 of these sites, and it is well worth your time to create a listing on each. It’s not just for Siri: getting listed on (and linked from) all these sites improves local listing and organic placements in SERPs as well.

Obviously, you’ll want your information to be correct, up to date, and fully filled out on these sites with accurate address, phone number, images, positive reviews, and a high number of ratings. For more info on local optimization, check out our post on local listings SEO.

Rich Snippets and Schema Tags

Schema.org provides a specific markup vocabulary (web code) that identifies key information about your business and web presence and makes that information easier for search engines to find.

Search engines are using on-page tags in a variety of ways. Google uses them to create rich snippets in search results and will continue to do so more and more. These snippets include author information, address, phone number, operating hours, and so on. So you can see how these tags are valuable for the local searches that are the focus of Siri. Offering this information in a highly structured format makes it that much easier to be found.
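As a rough illustration of what this structured information looks like, here is a Python sketch that builds a schema.org LocalBusiness description and prints it as JSON-LD, one of the formats search engines accept for this markup (microdata embedded directly in your HTML is another). The business details are invented placeholders.

import json

# Invented placeholder details for an example local business
local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Coffee House",
    "telephone": "+1-512-555-0100",
    "openingHours": "Mo-Fr 07:00-18:00",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Example St",
        "addressLocality": "Austin",
        "addressRegion": "TX",
        "postalCode": "78701",
    },
}

# Paste the output into a <script type="application/ld+json"> tag on the page
print(json.dumps(local_business, indent=2))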

Variety in Linkbuilding and Long Tail Keywords

This is the first version of Siri, and its language capabilities will continue to deepen with new versions, so the following effect will only grow. Siri queries are already longer because users are searching in natural speech rather than pecking away at keyboards or small iPhone touchscreens.

The result is more long-tail and highly targeted searches. Optimizing for long-tail means more words on the page and more flexible link building. Both of these strategies work in organic search as well, so you won’t even have to duplicate your efforts.

It used to be that you chose your anchor text and could simply bang away at it over and over. With enough links, you’d move on up. That hasn’t been best practice for a while, and Google is becoming even more focused on natural-looking anchor text profiles. Not only is this a safety-first method, but it’s also more efficient. Flexible anchor text (anchor text with the keyword integrated here and there, but also broadly varied) is more efficient in increasing rankings, even for the targeted, high-volume terms.

Back to Siri, the efforts you make to naturalize and get the most out of your link profile will also help you rank for long-tail searches, which Siri is all about. As a bonus, long tail searches are more targeted to the specific needs of a given search query and therefore convert at higher rates.

The iPhone 4S (S is for Siri? Seems that way to me…) is Apple’s best-selling phone to date, with 4 million sales in three days. Verizon started carrying the iPhone earlier this year, and even Sprint has had no choice but to jump on the bandwagon. It’s a monolith, and it’s the impetus for a new chapter of search optimization.

Today, Google offered a new feature to enhance its already-robust search capabilities: Secure Search. Google is the first major search engine to offer search in a secure setting.
Pictured below, secure search works and looks just like traditional search, but operates over SSL (Secure Sockets Layer) and can be found here: https://www.google.com. The “https” indicates a secure browsing connection, unlike traditional “http” locations:

What Does it Do?

Google’s secure search protects the transmission of data between a user’s computer and Google’s server. So, a user searching for “ways to pass a drug screening” would enjoy enhanced security for that search: the user’s search query could not be intercepted in transit by persons snooping on internet traffic–which in an unsecured environment, is largely open to viewing by anyone. It’s the same technology that protects the transmission of credit card numbers.

However, it has limitations: the user’s browser settings may retain the search query, making it visible to coworkers, spouses, or anyone with access to the physical computer on which the search was made.

No Change in Search Results

But will this new feature change search results, the order in which search results appear? No. TastyPlacement tested a variety of phrases in both environments, and the search results are unchanged.

Who Won’t Like It?

That’s easy: the Chinese Government and the North Korean Government. These governments snoop on their citizens by intercepting all sorts of internet traffic–including search queries. The secure connection means that a dissident in China can search for information without having his or her search queries read by China’s ubiquitous internet police. Of course, once a person clicks on a link–the visit to the destination website will be visible to snoopers. And, of course, the Chinese government can simply attempt to block access to all of Google’s servers.