
First blog post

This is the post excerpt.


This is your very first post. Click the Edit link to modify or delete it, or start a new post. If you like, use this post to tell readers why you started this blog and what you plan to do with it.


What Is Whitespace? 9 Websites to Inspire Your Web Design

Empty space is not always wasted space.

In fact, when it comes to web design, it’s a best practice to give your content a little breathing room.

Today’s website visitors are content-scanners. They scroll quickly, skim posts, and get distracted by busy layouts trying to accomplish too much. The key to getting your visitors’ undivided attention is simplicity — and that starts with an effective use of whitespace.

In this article, we’ll take a brief look at why whitespace matters, what it means for conversion-driven web design, and how nine websites are using whitespace to lead their visitors towards the desired action.

What Is Whitespace?

Whitespace refers to the negative areas in any composition. It’s the unmarked distance between different elements that gives viewers visual breaks as they process a design, minimizing distractions and making it easier to focus.

Intentionally blank areas aren’t just aesthetically pleasing — they actually have a big impact on how our brains take in and process new material. Too much information or visual data crammed into a small, busy space can cause cognitive fatigue, and our brains have difficulty absorbing anything at all. It’s information overload at its very worst.

Why We Need Whitespace

To understand the importance of whitespace, think about how difficult it is for your brain to process an entire page from the phone book or white pages. All those columns of teeny tiny text get squished together into one indigestible chunk of information, and it can be a real challenge to find what you’re looking for.

While phone books are designed to display maximum information in minimum space, the majority of print layouts are created to be more easily understood — thanks to whitespace.

To illustrate how effective whitespace is at helping our brains process information in print, check out the example below from Digital Ink:

See the difference? The layout on the left uses the vast majority of available space, but it looks crowded and severe — not exactly something you’d feel comfortable staring at for a long time to read.

In contrast, the layout on the right uses wider columns and more distance between paragraphs. It’s a simple design shift that has a major impact on making the article look more approachable and readable.

In addition to making layouts easier to understand, whitespace can also place emphasis on specific elements, helping the viewer understand what they should focus on. Using whitespace to break up a layout and group particular things together helps create a sense of balance and sophistication.

Take a look at this business card example from Printwand:

The business card on the left does include negative space, but the elements are still crammed into one area, making the whole card look cluttered and unprofessional. The card on the right uses whitespace to a better effect, spacing the individual elements out so the composition is easier to make sense of.

When it comes to designing websites, whitespace is crucial — not only from an aesthetic standpoint, but also from a conversion optimization perspective. Using whitespace effectively can make your website more easily navigable, comprehensible, and conversion-friendly, directing users more smoothly to calls to action and encouraging them to convert.

In fact, classic research by Human Factors International found that using whitespace to highlight or emphasize important elements on a website increased visitor comprehension by almost 20%.

Just take a look at these two website layouts:

On the left, the call-to-action button has no room to breathe — it’s wedged between busy dividers and tightly packed text. There’s too much distraction around the button, making it difficult for visitors to focus on what matters.

On the right, the call-to-action has been padded with some much-needed whitespace. The button now appears to be a focal point on the page, encouraging visitors to stop and take notice.

You’ll notice that adding some whitespace around our call-to-action has caused some of the other content on the page to be pushed down — and that’s perfectly okay. Not everything has to be above the fold (the part of the website that appears before the user starts to scroll). In fact, designers shouldn’t try to stuff a ton of content before the fold of the page, since it will end up looking cluttered and overwhelming.

9 Websites Using Whitespace Marketing to Their Advantage

1) Shopify

The homepage for ecommerce platform Shopify has a simple objective: Get visitors to sign up for a free trial.

To direct users to this action, they’ve surrounded their one-field sign-up form with plenty of whitespace, minimizing distractions and ensuring visitors can’t miss it. The site’s main navigation is displayed much smaller than the form text, and placed out of the way at the top of the screen to avoid taking attention away from the central form.


2) Everlane

Whitespace doesn’t have to mean the complete absence of color or pictures — it means making sure page elements are generously and strategically spaced to avoid overwhelming or confusing your visitors.

To show off its latest clothing collection, fashion retailer Everlane opts for a minimal setup: the full-page background shows off a photograph of its “GoWeave” blazer, and a small, expertly placed call-to-action appears in the center of the screen, encouraging users to click and “shop now.” It’s a perfect example of leading users towards an action without being pushy or aggressive.


3) Wistia

Using whitespace strategically can be as easy as making sure your forms and call-to-action buttons are noticeably separated from the rest of your content. This simple change makes a huge difference in how your content is perceived. 

Wistia, a video platform, anchors its homepage with a friendly question and a drop-down form. The two central CTA buttons serve as the focal point of the whole page, and they’re given plenty of space to set them apart from the site’s main navigation and image.


4) Welikesmall

Digital agency Welikesmall proves that whitespace doesn’t have to be boring, empty, or even static. Their homepage displays a fullscreen demo reel of their recent video projects, filtering through a variety of exciting vignettes to immediately capture the visitor’s attention. 

Full-screen video in any other context could seem busy and aggressive, but since the layout is designed with generous whitespace, it looks polished. With all the focus on the video background, the text is kept minimal. The agency’s logo appears in one corner, and a hamburger-style menu appears in the other. Welikesmall’s slogan — “Belief in the Making” — is fixed in the center of the screen, along with a call-to-action button linking to the agency’s full 2016 demo reel.


5) Simpla

This homepage from Simpla demonstrates the power that a relatively empty above-the-fold section can have. This simple, decidedly minimal homepage uses whitespace to urge users to keep scrolling.

Beneath the logo and navigation, a large portion of the site has been left unmarked. The top of a photo — along with a short paragraph of text and an arrow — invites visitors to keep reading to learn more about the company and their mission.

This unique use of whitespace not only looks sophisticated, but it strategically draws visitors further into the site. 

6) Harvard Art Museums

The Harvard Art Museums may be known for displaying centuries-old paintings, but their homepage is decidedly modern. The whitespace here provides the perfect backdrop for the featured art, making sure that nothing distracts from the pieces themselves. It’s about as close to a digital art exhibition as you can get.

The masonry-style layout gives the user a reason to keep scrolling, and also ensures that none of the images are crowded together. To maintain the minimal gallery aesthetic, the site’s navigation is completely hidden until the user hovers their mouse towards the top of the page.


7) Burnkit

When working with whitespace on your homepage, you’ll have to make some tough decisions about what’s important enough to display, since there’s less room for a pile of cluttered content. This design agency shows us that you can display a wide variety of content in a minimal layout, without squishing things together and muddying the composition. 

Burnkit’s homepage features blog content, key excerpts from the agency’s portfolio of client work, and behind-the-scenes looks at the agency’s culture. So how did they manage to fit so much onto one page without overwhelming the visitor? Whitespace. Lots and lots of whitespace. Each article is given generous padding, and the user can keep scrolling to reveal new material.


8) Medium

Medium cleverly uses whitespace to get readers to keep scrolling further down the page by enticing them with notes showing how many people have “clapped” for a post, how many people have commented on it, and what related content is next on the docket for them to read.

The whitespace pushes the reader to look at the center column of their screen, featuring a compelling title and cover photo — and uses social proof to show readers why they should keep scrolling.


9) Ahrefs

Ahrefs’ website is another example of whitespace that decidedly isn’t white, and its homepage uses both whitespace and text formatting to focus the visitor’s eyes on the glowing orange button that starts a free trial.

In bold, large font, Ahrefs offers its software’s value proposition, and in smaller, center-justified text, it uses whitespace to guide the viewer to click the CTA button. Smart, right?


Laura Crimmons on link building

With only a few weeks left until YoastCon 2017, it’s time we introduced another of our amazing speakers. Laura Crimmons is Communications Director at Branded3, an award-winning SEO and digital marketing agency in the UK. Laura herself also has an admirable amount of achievements and awards under her belt, for example being named PR Moment’s Young Professional of the Year 2017. At YoastCon, she’s going to talk about link building in a successful online campaign, and successfully structuring a link building campaign. We asked her a few questions about links and link building to give you a little preview!

Don’t miss the opportunity to see Laura in action! Get your ticket now for YoastCon 2017!

Come see Laura Crimmons at YoastCon 2017 on November 2 »

Tell us a bit about yourself and your background. How did you end up at Branded3? And what is the accomplishment you are most proud of while working at that agency?

My background prior to joining Branded3 was in PR; I did a PR degree and had some experience in more traditional PR agencies and in-house roles but it was always digital that appealed to me, so I decided to join Branded3. I joined about a week after manual penalties and Penguin first rolled out, so it was at a time when the agency (and the SEO industry as a whole) was trying to find its feet, with how to build links now that the old ways were (rightfully in many cases) being penalized. Thankfully PR seemed to be part of that solution.

You focus a lot on link building for larger clients. Link building, of course, is a science in itself. Could you share your tactics for starting a – hopefully – successful link building project?

The starting point always has to be the audience, and plenty of research. You need to understand:

  • Who is my audience?
  • Where do they hang out online?
  • What are they interested in?

From here you’re able to start brainstorming ideas that will engage the audience. At this point you should also have started to develop a list of sites that will be your targets for link acquisition.

Links are still incredibly important, even in this day and age. Everyone is looking for high-value links from relevant sites in their industry. What are your favorite tips for getting these kinds of quality links?

We use PR as a way to generate these kinds of links, i.e., working with journalists, who usually write for higher-quality sites (publishers) than, say, bloggers, who generally have lower-quality domains.

That said, there are lots of other high quality sites that you can attract links from without PR, for example by looking at any genuine resource sites in your industry that link to competitors but not you.

Every site owner needs to gather links, and local business owners would probably benefit even more from good links. Could you explain the impact of link building for local SEO?

Link building is important for local SEO in the same sense that it is for any SEO, however, when specifically looking at local SEO we place more emphasis on citations, data accuracy and proximity.

Do you see the importance of links changing anytime soon?

We all know that search engines have been trying for years to decrease their reliance on links as a ranking factor. But they haven’t got there yet and I don’t necessarily see that happening in the next year or so.

Even if they do manage to find a way to determine a site’s authority better than links, I still think the practice of Digital PR/Content Marketing that we do now for link acquisition will remain important, as it goes beyond just acquiring links. It’s about building brand awareness, affinity and ultimately does play a part in assisted conversions.

Why shouldn’t people miss your talk at YoastCon?

I’ve spent the last five and a half years working in link acquisition and have had a lot of success over that time gaining links from some of the biggest publishers in the UK and globally. So anyone who wants to up their game or learn some tips will probably take something away from it.

Get your ticket for YoastCon 2017 now!

Read more: ‘YoastCon 2017: Practical SEO’ »

The post Laura Crimmons on link building appeared first on Yoast.

Google Shares Details About the Technology Behind Googlebot

Posted by goralewicz

Crawling and indexing have been a hot topic over the last few years. As soon as Google launched Google Panda, people rushed to their server logs and crawling stats and began fixing their index bloat. All those problems didn’t exist in the “SEO = backlinks” era from a few years ago. With this exponential growth of technical SEO, we need to get more and more technical. That being said, we still don’t know exactly how Google crawls our websites. Many SEOs still can’t tell the difference between crawling and indexing.

The biggest problem, though, is that when we want to troubleshoot indexing problems, the only tool in our arsenal is Google Search Console and the Fetch and Render tool. Once your website includes more than HTML and CSS, there’s a lot of guesswork involved in how your content will be indexed by Google. This approach is risky, expensive, and can fail multiple times. Even when you discover the pieces of your website that weren’t indexed properly, it’s extremely difficult to get to the bottom of the problem and find the fragments of code responsible for the indexing problems.

Fortunately, this is about to change. Recently, Ilya Grigorik from Google shared one of the most valuable insights into how crawlers work:

Interestingly, this tweet didn’t get nearly as much attention as I would expect.

So what does Ilya’s revelation in this tweet mean for SEOs?

Knowing that Chrome 41 is the technology behind the Web Rendering Service (WRS) is a game-changer. Before this announcement, our only option was to use Fetch and Render in Google Search Console to see our pages as the WRS renders them. This means we can now troubleshoot technical problems that would otherwise have required experimenting and creating staging environments. All you need to do is download and install Chrome 41 to see how your website loads in the browser. That’s it.

You can check the features and capabilities that Chrome 41 supports by visiting Caniuse.com or Chromestatus.com (Googlebot should support similar features). These two websites make a developer’s life much easier.

Even though we don’t know exactly which version Ilya had in mind, we can find Chrome’s version used by the WRS by looking at the server logs. It’s Chrome 41.0.2272.118.
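Since server logs are where this version number shows up, here is a minimal Python sketch of pulling the Chrome token out of Googlebot user-agent strings. The log line below is a hypothetical example, not a real WRS entry:

```python
import re

# Hypothetical user-agent strings from an access log; the WRS identifies
# itself with a Chrome token inside the Googlebot user agent.
log_lines = [
    '"Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; '
    'compatible; Googlebot/2.1; +http://www.google.com/bot.html) '
    'Chrome/41.0.2272.118 Safari/537.36"',
]

CHROME_RE = re.compile(r"Chrome/(\d+(?:\.\d+)*)")

def wrs_chrome_versions(lines):
    """Return the set of Chrome versions seen in Googlebot user agents."""
    versions = set()
    for line in lines:
        if "Googlebot" in line:
            match = CHROME_RE.search(line)
            if match:
                versions.add(match.group(1))
    return versions

print(wrs_chrome_versions(log_lines))  # {'41.0.2272.118'}
```

Grepping your real logs for the `Chrome/` token in Googlebot requests should surface the same version.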

It will be updated sometime in the future

Chrome 41 was created two years ago (in 2015), so it’s far removed from the current version of the browser. However, as Ilya Grigorik said, an update is coming:

I was lucky enough to get Ilya Grigorik to read this article before it was published, and he provided a ton of valuable feedback on this topic. He mentioned that they are hoping to have the WRS updated by 2018. Fingers crossed!

Google uses Chrome 41 for rendering. What does that mean?

We now have some interesting information about how Google renders websites. But what does that mean, practically, for site developers and their clients? Does this mean we can now ignore server-side rendering and deploy client-rendered, JavaScript-rich websites?

Not so fast. Here is what Ilya Grigorik had to say in response to this question:

We now know the WRS’s capabilities for rendering JavaScript and how to debug them, which lets us troubleshoot and better diagnose problems. However, remember that not all crawlers support JavaScript; as of today, JavaScript crawling is only supported by Google and Ask (and Ask is most likely powered by Google). Even if you don’t care about social media or search engines other than Google, keep in mind that even with Chrome 41, not all JavaScript frameworks can be indexed by Google (read more about JavaScript frameworks crawling and indexing).

Don’t get your hopes up

All that said, there are a few reasons to keep your excitement at bay.

Remember that version 41 of Chrome is over two years old. It may not work very well with modern JavaScript frameworks. To test it yourself, open http://jsseo.expert/polymer/ using Chrome 41, and then open it in any up-to-date browser you are using.

The page in Chrome 41 looks like this:

The content parsed by Polymer is invisible (meaning it wasn’t processed correctly). This is also a perfect example for troubleshooting potential indexing issues. The problem you’re seeing above can be solved if diagnosed properly. Let me quote Ilya:

“If you look at the raised Javascript error under the hood, the test page is throwing an error due to unsupported (in M41) ES6 syntax. You can test this yourself in M41, or use the debug snippet we provided in the blog post to log the error into the DOM to see it.”

I believe this is another powerful tool for web developers willing to make their JavaScript websites indexable. We will definitely expand our experiment and work with Ilya’s feedback.

The Fetch and Render tool is the Chrome v. 41 preview

There’s another interesting thing about Chrome 41. Google Search Console’s Fetch and Render tool is simply the Chrome 41 preview. The righthand-side view (“This is how a visitor to your website would have seen the page”) is generated by the Google Search Console bot, which is… Chrome 41.0.2272.118 (see screenshot below).


There’s evidence that both Googlebot and Google Search Console Bot render pages using Chrome 41. Still, we don’t exactly know what the differences between them are. One noticeable difference is that the Google Search Console bot doesn’t respect the robots.txt file. There may be more, but for the time being, we’re not able to point them out.
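While the Google Search Console bot ignores robots.txt, the regular Googlebot does not, and you can check what it is permitted to fetch under a given ruleset with Python’s standard-library robot parser. The rules below are a made-up example:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt, parsed in-memory; against a live site you
# would instead point RobotFileParser at the robots.txt URL and .read() it.
rules = """
User-agent: Googlebot
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("Googlebot", "/private/page.html"))  # False
print(parser.can_fetch("Googlebot", "/public/page.html"))   # True
```

This mirrors what Googlebot itself should honor, which is useful when a Fetch and Render result differs from what you see in your own crawls.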

Chrome 41 vs Fetch as Google: A word of caution

Chrome 41 is a great tool for debugging Googlebot. However, sometimes (not often) there’s a situation in which Chrome 41 renders a page properly, but the screenshots from Google Fetch and Render suggest that Google can’t handle the page. It could be caused by CSS animations and transitions, Googlebot timeouts, or the usage of features that Googlebot doesn’t support. Let me show you an example.

Chrome 41 preview:

Image blurred for privacy

The above page has quite a lot of content and images, but it looks completely different in Google Search Console.

Google Search Console preview for the same URL:

As you can see, Google Search Console’s preview of this URL is completely different than what you saw on the previous screenshot (Chrome 41). All the content is gone and all we can see is the search bar.

From what we noticed, Google Search Console renders CSS a little differently from Chrome 41. This doesn’t happen often, but as with most tools, we need to double-check whenever possible.

This leads us to a question…

What features are supported by Googlebot and WRS?

According to the Rendering on Google Search guide:

  • Googlebot doesn’t support IndexedDB, WebSQL, and WebGL.
  • HTTP cookies and local storage, as well as session storage, are cleared between page loads.
  • All features requiring user permissions (like Notifications API, clipboard, push, device-info) are disabled.
  • Google can’t index 3D and VR content.
  • Googlebot only supports HTTP/1.1 crawling.

The last point is really interesting. Despite Google’s announcements over the last two years, Googlebot still crawls using only HTTP/1.1.
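One way to verify this against your own traffic is to read the protocol field out of your access logs. A rough Python sketch, using hypothetical combined-format log lines:

```python
import re
from collections import Counter

# Hypothetical access-log lines in the combined format; the request
# field carries the protocol version the client used.
logs = [
    '66.249.66.1 - - [10/Oct/2017:10:00:00 +0000] "GET / HTTP/1.1" 200 5120 "-" "Googlebot/2.1"',
    '203.0.113.7 - - [10/Oct/2017:10:00:02 +0000] "GET / HTTP/2.0" 200 5120 "-" "Mozilla/5.0"',
]

REQUEST_RE = re.compile(r'"[A-Z]+ \S+ (HTTP/[\d.]+)"')

def protocols_by_agent(lines, agent_token):
    """Count protocol versions for requests whose UA contains agent_token."""
    counts = Counter()
    for line in lines:
        if agent_token in line:
            match = REQUEST_RE.search(line)
            if match:
                counts[match.group(1)] += 1
    return counts

print(protocols_by_agent(logs, "Googlebot"))  # Counter({'HTTP/1.1': 1})
```

If the claim holds, every Googlebot entry in your logs should show HTTP/1.1, regardless of what your server supports.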

No HTTP/2 support (still)

We’ve mostly been covering how Googlebot uses Chrome, but there’s another recent discovery to keep in mind.

There is still no support for HTTP/2 for Googlebot.

Since it’s now clear that Googlebot doesn’t support HTTP/2, if your website supports HTTP/2 you still can’t drop HTTP/1.1 optimization. Googlebot can crawl only using HTTP/1.1.

There were several announcements recently regarding Google’s HTTP/2 support. To read more about it, check out my HTTP/2 experiment here on the Moz Blog.

Via https://developers.google.com/search/docs/guides/r…

Googlebot’s future

Rumor has it that Chrome 59’s headless mode was created for Googlebot, or at least that it was discussed during the design process. It’s hard to say if any of this chatter is true, but if it is, it means that to some extent, Googlebot will “see” the website in the same way as regular Internet users.

This would definitely make everything simpler for developers who wouldn’t have to worry about Googlebot’s ability to crawl even the most complex websites.

Chrome 41 vs. Googlebot’s crawling efficiency

Chrome 41 is a powerful tool for debugging JavaScript crawling and indexing. However, it’s crucial not to jump on the hype train here and start launching websites that “pass the Chrome 41 test.”

Even if Googlebot can “see” our website, there are many other factors that will affect your site’s crawling efficiency. As an example, we already have proof showing that Googlebot can crawl and index JavaScript and many JavaScript frameworks. It doesn’t mean that JavaScript is great for SEO. I gathered significant evidence showing that JavaScript pages aren’t crawled even half as effectively as HTML-based pages.

In summary

Ilya Grigorik’s tweet sheds more light on how Google crawls pages and, thanks to that, we don’t have to build experiments for every feature we’re testing — we can use Chrome 41 for debugging instead. This simple step will definitely save a lot of websites from indexing problems, like when Hulu.com’s JavaScript SEO backfired.

It’s safe to assume that Chrome 41 will now be a part of every SEO’s toolset.


Does Googlebot Support HTTP/2? Challenging Google’s Indexing Claims – An Experiment

Posted by goralewicz

I was recently challenged with a question from a client, Robert, who runs a small PR firm and needed to optimize a client’s website. His question inspired me to run a small experiment in HTTP protocols. So what was Robert’s question? He asked…

Can Googlebot crawl using HTTP/2 protocols?

You may be asking yourself, why should I care about Robert and his HTTP protocols?

As a refresher, HTTP protocols are the basic set of standards allowing the World Wide Web to exchange information. They are the reason a web browser can display data stored on another server. The first version was introduced back in 1989, which means that, like everything else, HTTP protocols are aging. HTTP/2 is the latest major version of the protocol, created to replace its aging predecessors.

So, back to our question: why do you, as an SEO, care to know more about HTTP protocols? The short answer is that none of your SEO efforts matter or can even be done without a basic understanding of HTTP protocol. Robert knew that if his site wasn’t indexing correctly, his client would miss out on valuable web traffic from searches.

The hype around HTTP/2

HTTP/1.1 is a 17-year-old protocol (HTTP 1.0 is 21 years old). Both HTTP 1.0 and 1.1 have limitations, mostly related to performance. When HTTP/1.1 was getting too slow and out of date, Google introduced SPDY in 2009, which was the basis for HTTP/2. Side note: Starting from Chrome 53, Google decided to stop supporting SPDY in favor of HTTP/2.

HTTP/2 was a long-awaited protocol. Its main goal is to improve a website’s performance. It’s currently used by 17% of websites (as of September 2017). Adoption rate is growing rapidly, as only 10% of websites were using HTTP/2 in January 2017. You can see the adoption rate charts here. HTTP/2 is getting more and more popular, and is widely supported by modern browsers (like Chrome or Firefox) and web servers (including Apache, Nginx, and IIS).

Its key advantages are:

  • Multiplexing: The ability to send multiple requests through a single TCP connection.
  • Server push: When a client requires some resource (let’s say, an HTML document), a server can push CSS and JS files to a client cache. It reduces network latency and round-trips.
  • One connection per origin: With HTTP/2, only one connection is needed to load the website.
  • Stream prioritization: Requests (streams) are assigned a priority from 1 to 256 to deliver higher-priority resources faster.
  • Binary framing layer: HTTP/2 is easier to parse (for both the server and the client).
  • Header compression: This feature reduces overhead from plain text in HTTP/1.1 and improves performance.

For more information, I highly recommend reading “Introduction to HTTP/2” by Surma and Ilya Grigorik.
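As a side note, you can check which protocol a given server will negotiate using ALPN with nothing but Python’s standard library. This is only a sketch, assuming the server speaks TLS on port 443; the hostname in the main guard is just an example:

```python
import socket
import ssl

def alpn_protocol(host, port=443):
    """Negotiate TLS with a server, offering h2 and http/1.1 via ALPN,
    and return the protocol the server selects (None if no ALPN)."""
    ctx = ssl.create_default_context()
    ctx.set_alpn_protocols(["h2", "http/1.1"])
    with socket.create_connection((host, port), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.selected_alpn_protocol()

if __name__ == "__main__":
    # Requires network access; servers that speak HTTP/2 return "h2".
    print(alpn_protocol("www.google.com"))
```

Servers without HTTP/2 support will typically select "http/1.1" instead, which is exactly the fallback Googlebot relies on.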

All these benefits suggest pushing for HTTP/2 support as soon as possible. However, my experience with technical SEO has taught me to double-check and experiment with solutions that might affect our SEO efforts.

So the question is: Does Googlebot support HTTP/2?

Google’s promises

HTTP/2 represents a promised land, the technical SEO oasis everyone was searching for. By now, many websites have already added HTTP/2 support, and developers don’t want to optimize for HTTP/1.1 anymore. Before I could answer Robert’s question, I needed to know whether or not Googlebot supported HTTP/2-only crawling.

I was not alone in my query. This is a topic which comes up often on Twitter, Google Hangouts, and other such forums. And like Robert, I had clients pressing me for answers. The experiment needed to happen. Below I’ll lay out exactly how we arrived at our answer, but here’s the spoiler: it doesn’t. Google doesn’t crawl using the HTTP/2 protocol. If your website uses HTTP/2, you need to make sure you continue to optimize the HTTP/1.1 version for crawling purposes.

The question

It all started with a Google Hangout in November 2015.

When asked about HTTP/2 support, John Mueller mentioned that HTTP/2-only crawling should be ready by early 2016, and he also mentioned that HTTP/2 would make it easier for Googlebot to crawl pages by bundling requests (images, JS, and CSS could be downloaded with a single bundled request).

“At the moment, Google doesn’t support HTTP/2-only crawling (…) We are working on that, I suspect it will be ready by the end of this year (2015) or early next year (2016) (…) One of the big advantages of HTTP/2 is that you can bundle requests, so if you are looking at a page and it has a bunch of embedded images, CSS, JavaScript files, theoretically you can make one request for all of those files and get everything together. So that would make it a little bit easier to crawl pages while we are rendering them for example.”

Soon after, Twitter user Kai Spriestersbach also asked about HTTP/2 support:

His clients started dropping HTTP/1.1 connections optimization, just like most developers deploying HTTP/2, which was at the time supported by all major browsers.

After a few quiet months, Google Webmasters reignited the conversation, tweeting that Google won’t hold you back if you’re setting up for HTTP/2. At this time, however, we still had no definitive word on HTTP/2-only crawling. Just because it won’t hold you back doesn’t mean it can handle it — which is why I decided to test the hypothesis.

The experiment

For months as I was following this online debate, I kept receiving questions from clients who no longer wanted to spend money on HTTP/1.1 optimization. Thus, I decided to create a very simple (and bold) experiment.

I decided to disable HTTP/1.1 on my own website (https://goralewicz.com) and make it HTTP/2 only. I disabled HTTP/1.1 from March 7th until March 13th.

If you’re going to get bad news, at the very least it should come quickly. I didn’t have to wait long to see if my experiment “took.” Very shortly after disabling HTTP/1.1, I couldn’t fetch and render my website in Google Search Console; I was getting an error every time.

My website is fairly small, but I could clearly see that the crawling stats decreased after disabling HTTP/1.1. Google was no longer visiting my site.

While I could have kept going, I stopped the experiment after my website was partially de-indexed due to “Access Denied” errors.

The results

I didn’t need any more information; the proof was right there. Googlebot wasn’t supporting HTTP/2-only crawling. Should you choose to duplicate this at home with your own site, you’ll be happy to know that my site recovered very quickly.

I finally had Robert’s answer, but felt others may benefit from it as well. A few weeks after finishing my experiment, I decided to ask John about HTTP/2 crawling on Twitter and see what he had to say.

(I love that he responds.)

Knowing the results of my experiment, I have to agree with John: disabling HTTP/1 was a bad idea. However, I was seeing other developers discontinuing optimization for HTTP/1, which is why I wanted to test HTTP/2 on its own.

For those looking to run their own experiment, there are two ways of negotiating a HTTP/2 connection:

1. Over HTTP (insecure) – Make an HTTP/1.1 request that includes an Upgrade header. This seems to be the method to which John Mueller was referring. However, it doesn’t apply to my website (because it’s served via HTTPS). What is more, this is an old-fashioned way of negotiating, not supported by modern browsers. Below is a screenshot from Caniuse.com:

2. Over HTTPS (secure) – Connection is negotiated via the ALPN protocol (HTTP/1.1 is not involved in this process). This method is preferred and widely supported by modern browsers and servers.
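For illustration, the first (cleartext “h2c”) method boils down to sending an ordinary HTTP/1.1 request with a couple of extra headers. A hypothetical sketch of those raw bytes (example.com is a placeholder host):

```python
# A minimal cleartext h2c upgrade request, sketched as the raw bytes a
# client would send. The HTTP2-Settings value is normally a
# base64url-encoded SETTINGS frame payload; it is left empty here.
upgrade_request = (
    "GET / HTTP/1.1\r\n"
    "Host: example.com\r\n"
    "Connection: Upgrade, HTTP2-Settings\r\n"
    "Upgrade: h2c\r\n"
    "HTTP2-Settings: \r\n"
    "\r\n"
)

# A server that accepts the upgrade answers "101 Switching Protocols"
# and then continues the connection as HTTP/2.
print(upgrade_request)
```

The second (HTTPS) method never sends these headers; the protocol choice happens inside the TLS handshake via ALPN instead.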

A recent announcement: The saga continues

Googlebot doesn’t make HTTP/2 requests

Fortunately, Ilya Grigorik, a web performance engineer at Google, let everyone peek behind the curtains at how Googlebot is crawling websites and the technology behind it:

If that wasn’t enough, Googlebot doesn’t support the WebSocket protocol. That means your server can’t send resources to Googlebot before they are requested. Supporting it wouldn’t reduce network latency and round-trips; it would simply slow everything down. Modern browsers offer many ways of loading content, including WebRTC, WebSockets, loading local content from disk, etc. However, Googlebot supports only HTTP/FTP, with or without Transport Layer Security (TLS).

Googlebot supports SPDY

During my research and after John Mueller’s feedback, I decided to consult an HTTP/2 expert. I contacted Peter Nikolow of Mobilio and asked him whether there was anything we could do to find a definitive answer regarding Googlebot’s HTTP/2 support. Not only did he provide us with help, Peter even created an experiment for us to use. Its results are pretty straightforward: Googlebot supports the SPDY protocol and Next Protocol Negotiation (NPN), and therefore it doesn’t support HTTP/2.

Below is Peter’s response:


I performed an experiment that shows Googlebot uses the SPDY protocol. Because it supports SPDY + NPN, it cannot support HTTP/2. There are many cons to continued support of SPDY:

    1. The protocol is vulnerable
    2. Google Chrome no longer supports SPDY in favor of HTTP/2
    3. Servers have been dropping SPDY support. Take NGINX, for example: as of version 1.9.5, it no longer supports SPDY.
    4. Apache doesn’t support SPDY out of the box. You need to install mod_spdy, which is provided by Google.

To examine Googlebot and the protocols it uses, I took advantage of s_server, a tool that can debug TLS connections. I used Google Search Console Fetch and Render to send Googlebot to my website.

Here’s a screenshot from this tool showing that Googlebot is using Next Protocol Negotiation (and therefore SPDY):

I’ll briefly explain how you can perform your own test. The first thing you should know is that you can’t use scripting languages (like PHP or Python) for debugging TLS handshakes. The reason is simple: these languages operate on HTTP-level data only and can’t see the underlying TLS handshake. Instead, you should use a tool built for debugging TLS handshakes, such as s_server.

Type in the console:

sudo openssl s_server -key key.pem -cert cert.pem -accept 443 -WWW -tlsextdebug -state -msg
sudo openssl s_server -key key.pem -cert cert.pem -accept 443 -www -tlsextdebug -state -msg

Please note the slight (but significant) difference between the “-WWW” and “-www” options in these commands: “-WWW” emulates a simple web server and serves files from the current directory, while “-www” sends back a plain HTML status page. You can find more about their purpose in the s_server documentation.

Next, invite Googlebot to visit your site by entering the URL in Google Search Console Fetch and Render or in the Google mobile tester.

As I wrote above, there is no logical reason for Googlebot to keep supporting SPDY. The protocol is vulnerable, and no modern browser supports it. Additionally, servers (including NGINX) are dropping support for it. It’s just a matter of time until Googlebot is able to crawl using HTTP/2. Just implement HTTP/1.1 + HTTP/2 support on your own server (your users will notice the faster loading) and wait until Google is able to send requests using HTTP/2.


Summary

In November 2015, John Mueller said he expected Googlebot to crawl websites by sending HTTP/2 requests starting in early 2016. We don’t know why, as of October 2017, that hasn’t happened yet.

What we do know is that Googlebot doesn’t support HTTP/2. It still crawls by sending HTTP/1.1 requests. Both this experiment and the “Rendering on Google Search” page confirm it. (If you’d like to know more about the technology behind Googlebot, then you should check out what they recently shared.)
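You can verify this finding against your own server logs: in the common combined log format, the quoted request line records the protocol version, so filtering for Googlebot’s user agent shows which protocol it actually used. A rough sketch (the sample log entries below are invented for illustration, not taken from my server):

```python
import re
from collections import Counter

# Matches the quoted request line ("GET /path HTTP/1.1") and the quoted
# user-agent field at the end of a combined-format access log entry.
LINE_RE = re.compile(r'"(?:GET|POST|HEAD) \S+ (HTTP/[\d.]+)".*"([^"]*)"\s*$')

def googlebot_protocols(lines):
    """Count HTTP protocol versions used by requests identifying as Googlebot."""
    counts = Counter()
    for line in lines:
        m = LINE_RE.search(line)
        if m and "Googlebot" in m.group(2):
            counts[m.group(1)] += 1
    return counts

# Invented sample entries: one Googlebot hit, one regular browser hit.
sample = [
    '66.249.66.1 - - [10/Oct/2017:06:25:01 +0000] "GET /robots.txt HTTP/1.1" 200 512 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.7 - - [10/Oct/2017:06:25:09 +0000] "GET / HTTP/2.0" 200 9043 "-" '
    '"Mozilla/5.0 (Windows NT 10.0) Chrome/61.0"',
]
print(googlebot_protocols(sample))
```

If Googlebot ever starts crawling over HTTP/2, a tally like this is a quick way to notice it without re-running the whole experiment. (Note that user-agent strings can be spoofed, so a reverse-DNS check is needed for a rigorous count.)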

For now, it seems we have to accept the status quo. We recommend that Robert (and you readers as well) enable HTTP/2 on your websites for better performance, but continue optimizing for HTTP/1.1. Your visitors will notice and thank you.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

7 of the Best Promotional Product Videos Ever

One of the wisest things I’ve ever read about product marketing came from the writer of a children’s book.

“If you want to build a ship, don’t drum up people to collect wood and don’t assign them tasks and work, teach them to long for the endless immensity of the sea,” said Antoine de Saint-Exupéry, creator of The Little Prince.

The goal in crafting a perfect product video is not far off. If you want your video to resonate, it should be about more than just the product. It should be about the problem, the solution, the experience, and the larger vision of what you’re trying to build.

Considering the right video can put a product on the map for the first time or reinvigorate a company that has long been stale, it’s important that marketers have a strong grasp on this. So to inspire your own efforts, we’ve collected a list of impressive product videos for marketing a product or new release.

What Is a Product Video?

Product Videos Defined

A product video is one that explains and visually demonstrates a product’s tangible benefits. A lot of product videos tend to emphasize a product’s unique features, but what truly differentiates a good product video is its ability to show how the product solves problems.

What Makes a Good Product Video?

As a rule, remarkable product videos encompass the following:

  • Engaging dialogue and narration
  • Long enough to fully explain the product and its benefits, but short enough to keep the viewer’s attention
  • Professionalism, without being “stuffy”
  • Empathy and relatability

Want to see how these elements are put into action? Check out the examples below.

7 Promotional Product Videos That Make You Want to Buy

1) Blendtec: Will it Blend?

I’m digging into the archives for this one, but in the world of videos that add life to a product, few have done it better than Blendtec. The company’s CEO, Tom Dickson, became a YouTube icon back in 2006 with the introduction of his Will it Blend? series.

Since then, Blendtec has expanded the tremendous success of these videos to other channels, enabling viewers to suggest things to blend on Facebook. The company even has its own Wikipedia page dedicated to the series.

The success of this video comes down to two things: a clear, unwavering message and a company with a personality. In seven years, the series has never changed. The point of each video and the underpinning of the product positioning is essentially, “Why yes, it will blend.”

For years, we’ve been watching this product blend everything from glow sticks to an iPhone. The videos are minimally expensive, product-focused, and garner millions of views. In a recent interview, Dickson explained the history and success of the video series:

“‘Will it Blend?’ was developed accidentally by a new marketing director hired in 2006. I have always been one to try to break my blenders to find their fail points and determine how I can improve them. George, the new marketing director, discovered some of the wacky things I was doing to my blenders … With a $50 budget, George bought a Happy Meal, a rotisserie chicken, Coke cans, golf balls, and a few other items, and they made five videos. Six days later, we had six million views on YouTube. Six years, 120-plus videos, almost 200 million views later, ‘Will it Blend?’ has been named as the number one viral marketing campaign of all time [by Ad Age].”

Here’s Tom blending a Facebook request: Justin Bieber. The video earned 2.8 million views (and counting) on YouTube.

2) Dollar Shave Club: Our Blades are Great

Dollar Shave Club also made waves with their first product video. I’ll warn you now: they’re not shy with the F-bombs or referring to “your handsome-ass grandfather,” so you may want to throw in the headphones before pressing play. Having said that, what’s singular about this product launch video is how well the company knows its audience and the problem it’s trying to solve.

Dollar Shave Club was trying to crack into a demographic of young, professional men who habitually purchase big-brand razors at local stores. The problem they attempt to highlight is the absurdly high cost of store-bought razor cartridges. Thus, the company needed an absurdist, well-targeted product launch video to match.

CEO Michael Dubin, who studied improv with the Upright Citizens Brigade, wrote the spot himself and hired a comedian friend, Lucia Aniello, to produce the video. According to reports on Quora, the video cost approximately $4,500 — and yet, it got more than 11 million views and coverage on countless media outlets.

3) Purple Feather: The Power of Words

When marketing budgets are tight, professional copywriting services are often the first to be cut. Instead of hiring professional copywriters, companies opt to take on the writing themselves, figuring it’s not all that different from other writing they do. They assume the words they choose won’t make much of a difference one way or the other. Purple Feather, a copywriting agency based in Glasgow, set out to prove that assumption wrong.

Words matter. In fact, they can change everything. Purple Feather made that point exceptionally clear in this powerful video:

4) Google Chrome: Jess Time

The best product videos focus not on the product itself, but on the stories of the people who use it.

Technology writer and NYU Professor Clay Shirky has a great chapter in his first book about the pervasiveness of communications tools in our lives. In it, he explains that technology doesn’t truly get interesting until it becomes so ingrained in our lives it turns invisible. No product video shows this “invisibility” of really good products better than Google’s “The Web is What You Make It” series.

The video below demonstrates how seamlessly Google and all of its products have melded into our lives and become a part of how we interact. It’s a video about an experience, not software, and that is arguably what the company truly creates.

5) Apple: The Only Thing That’s Changed

Launch videos like the Dollar Shave Club video above have a bit of an advantage when it comes to resonating with an audience. They represent a brand new company, product, or idea. But what if your company has been around for a long time? What if the announcement you’re making is really more of a set of enhancements to an existing product than a brand new launch?

This year, Apple tackled that challenge head-on with the following video. This video takes a collection of seemingly small enhancements and strings them together in a way that underscores just how advanced the total new functionality is. Take a look:

6) Google: Google, Evolved

This year Google introduced a new logo for the company and a new parent company, Alphabet. It was the perfect moment for retrospection. So the company took to video to show not only how much Google’s products have evolved, but how much progress those products have enabled in the world around them.

The brilliance of this video is that it uses others to tell the story. Whereas some companies may have pointed the camera at their own designers and developers (looking at you, Apple), Google put the focus on the users, media, and cultural leaders that have adopted and promoted the products along the way. The resulting video plays more like a historical chapter than a commercial.

7) InVision: Design Disruptors

I want to end this list with a bit of an anomaly, because it pushes at the boundaries of what can be considered a product video and, as such, opens up all sorts of opportunities.

InVision, a prototyping, collaboration, and workflow platform, wants to empower designers: its primary users. Much of its content strategy is bent on this mission. This year, InVision will launch a documentary on the role of design in the modern business.

Design Disruptors looks at how 15 top businesses prioritize design in their products and overall user experience. Unlike traditional product videos, Design Disruptors will run in theaters and on Netflix. And unlike traditional product videos, Design Disruptors never actually promotes the product. The goal is bigger than the product.

“We’re trying to bring attention to the increased importance of design in a company’s success,” explains David Malpass, InVision’s vice president of marketing. “A lot of our work is based on doing things that’ll create a positive effect on the design community and that will elevate the role of the designer within their organization.”

Ask Yoast: having a privacy page and SEO

There are several kinds of pages that you’ll expect to find on most websites: a home page, a contact page or an about page, for example. In this Ask Yoast, I’m going to discuss another type of page: the privacy page. This is a page where you put your privacy policy, which allows your visitors to check, for instance, how their information is handled. Not every website will need such a page, but what about privacy pages and SEO? Is there a benefit to having a privacy page on your site? Read on to find out!

Optimize your site for search & social media and keep it optimized with Yoast SEO Premium »

Yoast SEO: the #1 WordPress SEO plugin

Info

Derek Little sent us an email with the following question:

I’ve heard that having (or not having ) a privacy page is or was a big factor in SEO, and important to Google. Is this still the case?

Watch the video or read the transcript further down the page!

Privacy pages and SEO?

So, does having or not having a privacy page on your site have an impact on SEO?

“Well, as far as I know, this was only ever a factor for the AdWords quality score, not for SEO itself. So, it was important when you were advertising in AdWords, and not in SEO.  

At the same time, having a privacy page on your site makes you look all that more professional, which can help, of course, if you’re selling something. So, I would say, make it, think about what you put on there, think about how you deal with the privacy of your visitors and customers. Put that on there, and show that to the world: that’s always a good thing.

Good luck!”   

Ask Yoast

In the series Ask Yoast we answer SEO questions from our readers. Have an SEO-related question? Let us help you out! Send an email to ask@yoast.com.
(note: please check our knowledge base first, the answer to your question may already be there! For urgent questions, for example about our plugin not working properly, we’d like to refer you to our support page.)

Read more: ‘Holistic SEO’ »

The post Ask Yoast: having a privacy page and SEO appeared first on Yoast.

Snapchat Use Among Influencers Is Down 33% [New Data]

We’ve been following the competition heating up between Snapchat and Instagram since Instagram launched Stories last August.

And in the year since, we’ve been keeping track of the changing tides of disappearing messages and stories — and Instagram is the one to beat.

The number of Instagram Stories users exploded to 250 million users in just a year, far outpacing Snapchat’s 173 million users. Anecdotally, more marketers have said they prefer Instagram Stories, and gradually, social media influencers have stopped flocking to Snapchat in favor of Instagram Stories to connect with their fans in an authentic, unpolished way.

MediaKix conducted a six-month-long study to confirm whether Instagram Stories was, in fact, beating out Snapchat among top social media influencers — and the results are in.

Among other findings, they found that:

  • Influencers shared twice as many stories on Instagram as on Snapchat.
  • Snapchat experienced a 33% decline in use among influencers.

Read the full infographic below to check out the full findings — and learn if you should be adjusting your social media strategy like an influencer.

[Infographic: Instagram Stories vs. Snapchat Stories among top influencers]