In the world of Single Page Applications, client-side rendering has long been the go-to method for rendering content. However, as SPAs have evolved, other rendering modes have emerged that offer different advantages and disadvantages. In my talk, we will explore why it's important to go beyond a blank page as the initial response and look at different rendering modes like SSR, SSG, ISG and more. We'll not only cover the pros and cons of each mode but also provide real and comparable examples. By the end of the talk, you'll have a better understanding of the different rendering modes available for SPAs and be able to choose the best one for your needs. Join my talk to explore the art of rendering modes.
The Art of Rendering Modes: Go Beyond a Blank Page
AI Generated Video Summary
1. Introduction to Google and Website Traffic
We start very simple. Google processes 8.6 billion searches per day. What percentage of websites actually get visitors from Google? Not even 10%. A study shows zero visits for a large share of websites.
Hi, everybody. Wow. So, we start very simple. Who of you uses Google? Who of you uses DuckDuckGo? I see a few hands, nice, awesome. But Google approximately processes 8.6 billion searches per day. That means, doing quick math, 100K searches per second. Crazy amount, right? But from that number, what do you think? We all search for various things, Vue.js Live, for example. And we see lots of websites, but what do you think is the percentage of websites that actually get visitors, like traffic, from Google? Just think of a number between zero and 100, of course. And I want you, right now, to guess it. Just tell me right away, okay? Three, two, one. Interesting. I think someone was very close here. Not even 10%. So there's a study from Ahrefs, and it shows clearly that a big part of the donut here has zero visits. And we'll see how your website will be not in the orange, also not in the red, but ideally in the green or purple part.
2. Introduction to SEO
Welcome to my talk on The Vueniverse of SEO. SEO is user-focused and aims to provide the best content and user experience. It's easy to pick up but hard to master, requiring continuous improvement. SEO is frequently changing, similar to web development ecosystems. Publicly available content, such as marketing pages and personal portfolios, benefits from SEO to ensure visibility on search engines.
So welcome to my talk, The Vueniverse of SEO: Uncovering the Secrets. Yeah, I'm Alex, web dev consultant. I'll just go very quickly over the slides because, obviously, there's not that much time for introduction, and I got a very nice introduction already. I have a Twitter account, I'm also on Mastodon, I have a website, and I'm on GitHub. So I am ready to navigate through the cosmos of SEO.
Nice, I love the energy. But what is SEO even? Well, it could be that it's rocket science, but no, luckily not. And even though it's search engine optimization, it is actually quite user-focused. Which also kind of makes sense: Google and all the other search engines want the best result for the user. So the best content, the best user experience. They want you to find what you're looking for. Also, SEO is quite easy to pick up, but it's hard to master. Like many things, sadly. And it needs continuous improvement. It's not like, okay, I can do my SEO now and I'm done forever. That is not how it works. And it is frequently changing. We as web developers know frequently changing ecosystems, sure. SEO is very similar. Oh, there's a core update from Google. Oh no, these pages don't rank that well anymore. But don't worry. We won't get into that depth today.
And when do I even need SEO? Well, first of all, your content must be publicly available, right? So if you have something behind authentication, you don't need SEO, it's fine. But if you have marketing pages for your company, or maybe your own portfolio site, it would be nice if, when you Google your name, maybe not your Facebook profile or Twitter handle shows up, but your own website, if you have one.
3. Introduction to SEO Galaxies
Forums, blogs, articles, and help databases benefit as well. SEO is not relevant for content behind authentication or for short-lived content. SEO has three parts: on-page, off-page, and technical. On-page SEO focuses on content, keywords, user experience, and meta tags. Off-page SEO involves link building, social media, brand-building, and authority. Technical SEO covers page speed, performance, security, broken links, and sitemaps. We will focus on on-page and technical SEO using Vue.
That would be great. And of course, forums, help databases, and so on, blogs, articles. And as I said, it's not really relevant for anything behind authentication. And if you have content that's only there for a couple of days, it probably won't make sense to optimize it; if it's gone very quickly, it likely doesn't need it.
So we have kind of three parts of SEO, let's say three SEO galaxies. One of them is on-page SEO. So this wonderful purple thing here. This mainly covers content, because content is still king. If you write good things on a website, people looking for these things will think, okay, you are an expert in your area, for example. That's very important. SEO can only be optimized if your content is really good. And of course, it's about keywords. So figure out, okay, what are people actually looking for? What is the terminology, and what intent is behind it? User experience is also important, because obviously if they can't navigate your site, they will leave very quickly. And also setting your meta tags, which we'll take a look at in a bit, and so on. Okay. That's on-page.
We also have off-page. So off-page is this wonderful galaxy here. And there it's about link building. So let other websites link to your page, maybe because they like the content: oh, that's a good reference, and so on. Same for social media, brand-building, citations, so people mention the name of your project or your company, and authority in general. You have to show that you're an expert in what you're writing. That's the point here.
And there's technical SEO, the third here. And that's about, we've heard a lot now, page speed, performance, security as well, actually, broken links, sitemaps, and so on. So these are just some parts of it. And we will focus on two, on-page and technical, because of time. And well, Vue covers exactly these two things. So let's check that out and see what happens.
4. How Google Crawls Your Site
So, what is Google doing? If you open the classic dev tools and use the device emulator with a mobile screen, then Google does the same: it crawls with a mobile screen, around 400 to 500 pixels wide, but the viewport height will just be about 9000 pixels. So they don't have to scroll; they get all the content they need from these 9000 pixels, and they don't trigger any scrolling or similar. Okay. Weird.
5. Challenges with SPAs and Solutions
6. Nuxt.js Route Rules and Future Features
You can do pre-rendering, static site generation, and ISR with Nuxt.js. Rolling your own server rendering is possible but requires a lot of work. Nuxt allows you to define route rules; configuring them on the page component itself is coming in the future.
You can do pre-rendering, static site generation, ISR. We heard a lot about that from Alba already. We can also nicely mix them up. And you also know which framework can do that. No surprise. Nuxt.js.
You can also roll your own server rendering of course. That's totally possible. But it's tons of work. Trust me. I've seen these setups. I went through them. And it's like, OK, we have our custom setup. How do we migrate? It's tough.
And you can define route rules. Whoever used route rules in Nuxt? One, two, three. OK. So you can say, here, for example, every route that begins with blog and then has any kind of, let's say, slug article name, should be, for example, static. So say, OK, this will be rendered once, then cached forever. Or stale-while-revalidate. Or we disable SSR right away. And the cool thing is, this is usually configured in your Nuxt config. But, little spoiler, soon that's not necessary anymore. You can configure it on the page component itself. You can use defineRouteRules and say, OK, this page, SSR false. No problem. I don't have a long, long list in the route rules anymore in the Nuxt config, but I have this very nicely located in the page component. There's a PR up for that. So this will come in the future.
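As a rough sketch of what such route rules and the upcoming per-page variant can look like (the paths and cache times here are made up for the example, not taken from the talk):

```ts
// nuxt.config.ts — hypothetical route rules mixing rendering modes per route
export default defineNuxtConfig({
  routeRules: {
    '/blog/**': { prerender: true }, // rendered once at build time, cached forever
    '/products/**': { swr: 3600 },   // stale-while-revalidate: re-generated at most hourly
    '/dashboard/**': { ssr: false }, // client-side rendering only
  },
})

// pages/dashboard.vue — the per-page variant mentioned in the talk
// (still behind a PR at the time, so the exact API may differ):
// <script setup lang="ts">
// defineRouteRules({ ssr: false })
// </script>
```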
7. Achieving Good SEO with HTTPS and Security Headers
To achieve good SEO, enable HTTPS for your website. It's a must-have nowadays and a ranking factor. Also, implement security headers and content security policies to create trust for visitors.
And now, let's see what's needed to achieve good SEO. So I made a little overview. We always have a feature or task or similar. Then we check if it's a technical or on-page topic, and the effort. One rocket means low effort, low-hanging fruit, so let's just go for it. And we start once again with very simple things. Basic. Basic security. It sounds like, ah, okay, what's this? Well, who of you does use HTTPS in their production environment? Okay. Who does not use HTTPS? Okay, one, two, three. You really should. You really should enable HTTPS. I mean, that's a must-have nowadays and it's a ranking factor. So given the same website with HTTPS on or off, Google will treat it better when it's on. And thanks to Let's Encrypt and CloudFront and so on, it's not a big deal anymore. Also, good practice: security headers, content security policies. These are not really related to SEO, but of course, if you're once on it, you should really do it, also to create trust for visitors. And as I said, it's low-hanging fruit. HTTPS is a ranking factor.
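As a minimal sketch of the security-headers idea (the values here are illustrative defaults, not recommendations from the talk), a starting set you might attach to every response could look like this:

```typescript
// A starting set of common security headers. Tune the CSP to your app;
// "default-src 'self'" will block third-party scripts, styles, and images.
function securityHeaders(): Record<string, string> {
  return {
    "Strict-Transport-Security": "max-age=31536000; includeSubDomains", // enforce HTTPS
    "Content-Security-Policy": "default-src 'self'",                    // restrictive CSP baseline
    "X-Content-Type-Options": "nosniff",                                // disable MIME sniffing
    "X-Frame-Options": "DENY",                                          // disallow framing (clickjacking)
    "Referrer-Policy": "strict-origin-when-cross-origin",
  };
}

console.log(securityHeaders());
```

In a Nuxt app, headers like these are typically attached via your web server, platform provider, or server middleware rather than in application code.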
8. Mobile Friendliness and Core Web Vitals
Mobile friendliness is a must-have for websites. Google crawls with a mobile screen, so ensure your site is responsive and does not have font or overflow issues. Mobile friendliness is a ranking factor. Use Google's Mobile-Friendly Test to check your site.
Mobile friendliness. Same idea here. Once again, a must-have. But the effort depends a little bit on how your site looks right now. Maybe you don't need to do anything because it's already responsive and nice. But, yeah, Google, as we just discussed with the scrolling part, crawls by default not only with a mobile user agent, but also with a mobile screen. Which means, if you have a font that's too small, or content that overflows and causes horizontal scrolling, that's not that good. And mobile friendliness is actually a ranking factor. There's also a tool that you can use to test it: Google's Mobile-Friendly Test. So that's worth checking.
9. Core Web Vitals, Text Compression, Broken Links
Core Web Vitals are essential metrics for a healthy site. LCP, FID, and CLS are ranking factors. Text compression is a must-have for faster websites. Brotli is a faster alternative to Gzip with smaller output. Check your site with a compression checker. Broken links and redirects can frustrate users, so check regularly and set up redirects for changed URLs.
And then we have the Core Web Vitals, Philippe already mentioned them. They are so-called essential metrics for a healthy site. From Google, we have LCP, FID, and CLS here. These are ranking factors. And also, as was mentioned before, FID will be replaced by Interaction to Next Paint (INP) very soon. So also there, gather some metrics and find out whether you fulfill these Core Web Vitals or not.
Effort depends a little bit, again, on the state of the site. Really hard to gauge by guessing. And then we have a very easy one, once again, which is text compression. Text compression is another must-have nowadays because it's very easy to set up and all browsers support it. And commonly, you use something like Gzip, right? That's almost the default. But nowadays we also have Brotli, for more than seven years now, I think. It's faster and produces smaller files as output, so it's always good to choose that if your server or provider supports it. You can set it up in your, let's say, Apache or NGINX, or if you use a CDN, it's usually on there by default. That's not a problem. And you can then use a checker to, for example, check your site. Also, good text compression means lower transmission size, so a faster website. Nuxt, or in this case Nitro, supports compression for static assets by default; for on-the-fly requests, as I just mentioned, you usually hand that over to your web server or platform provider. So if you host on Netlify, it's on by default, Brotli is there. Or if you use Cloudflare, you can just toggle and say, okay, I want to enable Brotli. Good. Once again, low-hanging fruit, go for it.
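As a quick standalone illustration (not code from the talk), you can compare Gzip and Brotli output sizes with Node's built-in zlib module; on repetitive text such as HTML, both shrink the payload dramatically:

```typescript
import { gzipSync, brotliCompressSync } from "node:zlib";

// A chunk of repetitive, HTML-like text — the kind of payload that
// compresses very well with either algorithm.
const payload = Buffer.from("<li>Yet another list item</li>\n".repeat(500));

const gzipped = gzipSync(payload);
const brotli = brotliCompressSync(payload);

console.log(`original: ${payload.length} bytes`);
console.log(`gzip:     ${gzipped.length} bytes`);
console.log(`brotli:   ${brotli.length} bytes`);
```

In practice you would not compress responses in application code like this; as the talk says, you hand it over to NGINX, Apache, or your CDN.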
And then another very common thing: broken links and redirects. It's like, okay, make sure you don't link to broken pages. It's not a surprise. We all know, if a user clicks on a link that's not there, they will be frustrated and you might lose a visitor on your page. So of course don't do that, but what you should do is check regularly for broken links on your page and also to your page. Let's say you've got a link from a newspaper, or a friendly dev linked to your portfolio, but maybe the sub-page is not available anymore because you changed the slug, you changed the URL. What you should do is set up redirects for them.
10. URL Changes, Redirects, and Canonical Links
Whenever you change a URL, keep the old one and set up a redirect. Resolve redirect chains and keep them consistent. Set a canonical link for every page to indicate the preferred version of your site. Handle trailing slashes consistently and point search engines to the correct version. Canonical links help avoid duplicate content and improve ranking.
So whenever you change a URL, keep the old one and set up a redirect, very important. Otherwise, you might lose visitors, because if they click on a link and it's broken, they'll say, okay, then I get my info from somewhere else. If you change URLs more often, you might run into redirect chains.
So slash A redirects to B to C. The best is to resolve these chains and redirect A to C and B to C straight away. But keep these redirects, just don't drop them. And once again, it's possible, for example, with your web server or platform provider, like Netlify or Vercel, to set them up, but you can also use, once again, the route rules, or if you have some more complicated ones, server middleware to say, okay, in this case I want to do a lookup in the database and redirect to the best fitting slug and so on. So that's usually something that the server part is covering.
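The chain-resolution step can be sketched as a small helper (a standalone example, not code from the talk): given a redirect map, follow each chain to its final target so every old URL redirects in a single hop.

```typescript
// Resolve redirect chains: if /a -> /b and /b -> /c, rewrite /a -> /c directly.
// All original entries are kept — we only shorten the hops, never drop a redirect.
function flattenRedirects(redirects: Record<string, string>): Record<string, string> {
  const resolved: Record<string, string> = {};
  for (const from of Object.keys(redirects)) {
    let target = redirects[from];
    const seen = new Set([from]);
    // Follow the chain until we hit a final destination (guard against loops).
    while (target in redirects && !seen.has(target)) {
      seen.add(target);
      target = redirects[target];
    }
    resolved[from] = target;
  }
  return resolved;
}

console.log(flattenRedirects({ "/a": "/b", "/b": "/c" }));
```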
And then we have canonical links. Anybody know what canonical links are? Okay. A few. Great. Very nice. So, the idea is, you should set a canonical link for every page, and a canonical link is the, let's say, preferred version of your site. And you're like, my version, I mean, there should just be one page per thing I write. Of course, of course, but there are lots of URLs that could point there, also for duplicate content. We'll have a look in a second. It's very important to also handle the trailing slash enforcement here. So let's say we have this tag, link rel canonical, and then we put the link in there. And the question could be, okay, do I have abc.com or www.abc.com? Do I have a trailing slash at the end or not? That's important, and you should enforce one version. It doesn't really matter if you use www in front or not, or have a trailing slash at the end or not. Both are fine. Just be consistent. And point Google and the other search engines to it and say, ah, that's the version that you should index. Otherwise, they might index two different versions, and they might both rank lower than the single version you could have pointed to. And also, say you have an e-commerce store with shoes, and then you have a special deal.
11. Handling Duplicated Content and URL Structure
If for some reason you have duplicated content here, saying, okay, yeah, I want to show these Nike Air Max in the special section with the same content as the regular shoe page, then here too you should provide a canonical link to the everlasting URL if it's exactly the same or duplicated content.
And there are lots of versions. So your web server or platform provider should handle HTTP versus HTTPS, for example: HTTP fully redirecting to HTTPS. Same for www versus non-www handling: decide on one version and redirect to it.
And your frontend part should handle the canonical link: okay, let's strip all the query parameters because we might not need them. Let's strip the trailing slash, or leave it there, or add it. Let's remove the hash part. All of these are important so that in the end we get this very simple canonical link. So once again, it's fine to have it with www in front or without a trailing slash. But be consistent. That's the key here.
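Those normalization steps can be sketched with the standard URL API (a standalone example, not code from the talk; whether you enforce www or a trailing slash is your choice — this sketch picks non-www, no trailing slash):

```typescript
// Normalize a raw URL to a canonical form: HTTPS, no query, no hash,
// non-www host, and no trailing slash (except for the root path).
function canonicalize(raw: string): string {
  const url = new URL(raw);
  url.protocol = "https:"; // enforce HTTPS
  url.hash = "";           // drop the #fragment
  url.search = "";         // drop query parameters
  url.hostname = url.hostname.replace(/^www\./, ""); // enforce non-www
  if (url.pathname.length > 1 && url.pathname.endsWith("/")) {
    url.pathname = url.pathname.slice(0, -1);        // drop trailing slash
  }
  return url.toString();
}

console.log(canonicalize("http://www.abc.com/blog/post/?id=1#top"));
```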
And then we have the URL structure. Of course we want short and descriptive URLs. Nobody remembers like slash id, slash 1ac, whatever. I don't either.
12. URLs, Meta Tags, and NOCs
We should use descriptive URLs and remove stop words. Trailing slashes and placing keywords in the URL are important. Avoid query parameters and use hyphens as delimiters. Choose between non-www and www. Optimize the URL by shortening it and adding relevant categories. Meta tags, including title and meta description, are crucial for search results. Use the UTF-8 charset and initial viewport. Test and optimize title and meta description. Utilize OG tags for improved link previews. Set meta tags and links through useHead.
So that should be clear. We should use descriptive URLs and also we can remove the stop words. That's also fine.
Also here, as just said, trailing slashes, yes or no, it's up to you, but you should enforce it there too. And you should place your keywords in the URL. So if you're writing a blog post about Nuxt.js, then the Nuxt part should be in your URL, or if you write a blog post about image optimization, it would be nice if image optimization is in the URL itself. Not only so the search engine understands it, but also so a user can just type 'image' in the URL bar and, if they were at your site before, nice, they get it right away.
And query parameters should be avoided unless you really need them; it's usually better to just work with slugs there. And as delimiters, you usually should use hyphens. Also once again, non-www vs. www, same idea here. So if you have something like abc.com slash question mark id equals whatever: horrible, please don't. You can improve that by removing the query parameters, but then it's still not really descriptive, so let's make it a little better. Okay, now it's a nice blog post about the benefits of Nuxt. Cool, but the delimiters are still off, and maybe there's a category to see here, right? Like blog, and then maybe the post. Still, it's a bit too lengthy, so maybe just shorten it a little. And that's a very nice short URL. So if you stick to simplifying it as much as you can, you're on a good track.
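A tiny slug helper illustrating these rules (the stop-word list here is a made-up minimal example, not from the talk): lowercase, strip punctuation, drop stop words, and join the remaining keywords with hyphens.

```typescript
// A minimal set of English stop words to drop from slugs (illustrative only).
const STOP_WORDS = new Set(["a", "an", "the", "of", "for", "and", "to", "in"]);

// Turn a title into a short, descriptive, hyphen-delimited slug.
function toSlug(title: string): string {
  return title
    .toLowerCase()
    .replace(/[^a-z0-9\s-]/g, "")              // strip punctuation
    .split(/\s+/)
    .filter((word) => word && !STOP_WORDS.has(word)) // remove stop words
    .join("-");                                 // hyphens as delimiters
}

console.log(toSlug("The Benefits of Image Optimization"));
```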
Okay, and then we have meta tags. So if we have a search and a result on a search result page, here, we have a title and a meta description. These are meta tags. So are the UTF-8 charset and the initial viewport. So we should find the ideal title and meta description for each page. Testing is, of course, necessary, but it's really worth it. And you should also use the OG tags to improve the link previews that I just mentioned. And meta tags and links are very easy to set. For example, these are set by default. Nice. And otherwise you can set everything through useHead. And we will have a look at that right away. So I'm happily switching my tab over here to Hello London. We'll quickly refresh this.
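As a rough sketch of setting these tags with useHead inside a page component (the titles, descriptions, and URLs here are placeholders, not from the talk):

```ts
// Inside <script setup> of a Nuxt page component.
useHead({
  title: 'Image Optimization in Nuxt',
  meta: [
    { name: 'description', content: 'How to serve smaller, faster images.' },
    // OG tags improve link previews on social platforms.
    { property: 'og:title', content: 'Image Optimization in Nuxt' },
    { property: 'og:description', content: 'How to serve smaller, faster images.' },
    { property: 'og:image', content: 'https://example.com/og.png' },
  ],
  link: [
    // The canonical link discussed earlier also goes through useHead.
    { rel: 'canonical', href: 'https://example.com/blog/image-optimization' },
  ],
})
```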
13. Introduction to DevTools and Meta Tags
The DevTools now have an open graph tab that shows meta tags, missing tags, and provides code snippets to easily add them. It also provides a preview for Twitter, Facebook, and LinkedIn. This feature was released recently and makes it easier than ever to get your meta tags right for SEO.
And we have the DevTools here. And in the DevTools, released very recently, there is an Open Graph tab, and it shows you exactly that. It shows you the meta tags, it shows you, for example, the body attributes I mentioned, and it shows missing tags, saying, okay, there are required tags, there are recommended tags, you should really add them. And there's a code snippet that's just saying, hey, why don't you add that right away and then fill it in. So you can just copy-paste that, saying, okay, let's add that, and then the DevTools will be happy. Plus, if you have all the tags, you get a preview for Twitter, Facebook, and LinkedIn, for the link preview of the URL. So we have the image there, the title there, the site name. And this has been released a couple of days ago, and it's working pretty nicely in any Nuxt application with the DevTools set up. So it has never been easier to get the meta tags right for your SEO.
14. Tips on HTML, SEO Kit, and Pre-rendering
Please use semantic HTML. Use alt text for images. Don't stuff images with keywords. Use Nuxt SEO Kit for sitemaps, robots.txt, and schema.org. Shout out to sponsors. Thanks for your attention. Prerender.io can be a good service for static sites.
And there's so much more, but not enough time, of course. I could always tell you: please use semantic HTML. If you link to another page, don't use a button, please. Use a link. Use alt text for images, very important. And please don't stuff them with keywords; just describe what the image is about. It will help every user, and it will also help the search engine.
Talking about page structure, sitemaps, structured data, robots.txt, hreflang, there's more. But I want to give you one more life hack. Use Nuxt SEO Kit, because it comes with a sitemap module, a robots.txt module, and schema.org support, so easier schema generation, also for SEO structured data. It comes with more experimental features that could improve performance as well, with dynamic OG image generation and a link checker for broken links. That module is maintained by Harlan, also a team member of Nuxt.js, and he's also maintaining the useHead integration that we saw. So, yeah, from here on, just give it a try: try it out, use the DevTools, use the modules.
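A sketch of wiring it up (following nuxt-seo-kit's setup around the time of the talk, when it was distributed as a Nuxt layer; the module has since evolved, so check the current docs, and the site values here are placeholders):

```ts
// nuxt.config.ts — nuxt-seo-kit as an extended layer
export default defineNuxtConfig({
  extends: ['nuxt-seo-kit'],
  runtimeConfig: {
    public: {
      siteUrl: 'https://example.com', // used by the sitemap and OG image generation
      siteName: 'My Site',
      siteDescription: 'A hypothetical example site.',
      language: 'en',
    },
  },
})
```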
And I want to shout out my sponsors, thanks a lot for giving me the opportunity to work on open source. And thank you for your attention; if you want the slides, feel free to scan the QR code or follow the link. And yeah, that's it for me, thanks a lot. Thanks a lot, Alex. This was really insightful. Already comfortable here. Yeah, quite nice. I'll try that one day. Yeah, I remember a few years ago, I actually started building my own website with React. I didn't even know about Vue back then. And then I noticed when I share links, nothing is visible. So I used prerender.io. What do you think about it? I think it can be a good service for sure. It will make things easier, but it depends a bit on the kind of data you have. So if you have a basically static site, you can use prerender.io. But you can also just statically render the site, and then you don't need a third-party service.
AI in SEO and the Changing Paradigm
Using AI in SEO can be helpful for people who are not familiar with it. AI can assist in describing content and finding relevant categories. However, AI is unlikely to replace the expertise of SEO specialists. The growth of AI-powered crawlers may reduce the need for manual SEO efforts. Google's focus on generative AI in search may impact the traditional SEO paradigm, especially with the trend towards zero-click search.
It can be a nice option if you have a big SPA and the data is not super near real time so it doesn't change that often. Then it can be pretty nice to have a good migration way to eventually generate some things statically. So a good thing if the data is not too often changing.
Yeah, it was a personal web service, it was a quick fix.
Yeah, true, true. Exactly. You know how it is. Of course.
Okay, let's jump into questions. For the first one, a pretty interesting question. With more people using ChatGPT and AI instead of Google, how do you think that will affect SEO? Interesting. I think it can help, especially people who are not into SEO. There are people doing SEO in companies full-time, right? And with AI, I think it can be easier to get into SEO in general, and also to figure out ways to describe the content better. I don't think AI will necessarily replace the classic SEO specialist or people's SEO knowledge, but I think it can be really helpful just to get things right. I also sometimes ask, okay, how do I summarize this and this? Give me a good overview of categories that could fit in there. So just to figure out more ways of describing your content or more categories to put it in, for example. 100%. Crawlers will be powered by AI and then we won't need to take that much care of SEO. True. I mean, I also think there's a tool called AutoGPT, and you can already say, yeah, crawl that site, describe the content, and that can already be automated in a certain manner. Yeah. Knowing the recent AI growth, that will happen in like two weeks. True. Awesome. So next question. At Google I/O, Google announced how generative AI will be key when using Google Search. How do you think the SEO paradigm will change so that your website is the one returned by Google's AI? Yeah. I think that can be a bit difficult, because we already saw the trend for quite a while towards zero-click search. So one-click search means you search something, you click on the first result, you get the info. That was very common.
SEO and Accessibility
Nowadays, zero-click is becoming more common, where Google provides information directly without the need to click on a website. While this benefits users by providing quick information, it can result in less traffic for websites. However, SEO is still important in the social media age because it allows people to find your content through various channels. Social media is just another channel, and crawlers even crawl social media posts. In terms of improving accessibility and SEO, they go hand in hand. Using alt tags for images helps both accessibility and SEO. Monitoring the improvement of SEO and accessibility can be done through various methods.
But nowadays, it's already zero-click as in you look for something, then Google already gives you like a little preview or a snippet of like, hey, this is how it works, or like a frequently asked question, you can just open it and like, okay, I don't even click on a website anymore. Google gives me that info that it got from your website. And I think with AI there, it will be the next step, next evolution of that, which means less traffic for your website, which is, on the one hand, good because people get information quicker. On the other hand, you lose traffic, you lose visitors. So that's, of course, a bad thing.
Okay. Thanks for this long answer. Let's jump to the next one. Is SEO still as important in the social media age? I would say yes, because there are lots of ways people can find your content. And I mean, just imagine how many Google searches we are doing, or DuckDuckGo searches, or Bing, or whatever you use every day. I definitely think this is very valuable. Social media is just another channel. So ideally, you want traffic from all the channels, because people share the link on social media, because people search for your content, because you share it. So I think yes, it's still very important. Yeah, a hundred percent. Crawlers even crawl social media posts and everything. True, absolutely. So yeah, a hundred percent.
Okay. Let's jump to the next question. What possibilities are there to see how the SEO of our page improves with better accessibility? Do you think it could be taken into account in the future? Can you read the question once again, please? Yeah, it's confusing. What possibilities are there to see how the SEO of our page improves with better accessibility? Okay, I see. So, accessibility, a11y, and SEO go hand in hand, first of all, because, as I just mentioned, use alt tags. This helps accessibility and it also helps SEO, because the user who might not see the image can understand it, and so can the search engine crawlers. In lots of ways, it's also user-first. You should focus on making users happy. But how to monitor how things get better? Well, there's a good way for that.
Importance of Google Search Console
Register your site on Google Search Console to track impressions and search results. Monitor changes and analyze the impact of your modifications. It provides valuable insights on keyword performance and user search trends.
So there's Google Search Console; you can register your site there. And you get stats like how many impressions you got: how often you were in the search results, for which words, which pages, which country. And then you change something. And usually you have to be patient, because it takes a while until Google picks something up. And then you see, okay, either nothing changed, or you got an X percent increase, or, oh, for this keyword, more and more people are searching for it. So registering your site on Google Search Console and keeping an eye on it is the thing, also a very important part.