The Art of Rendering Modes: Go Beyond a Blank Page


In the world of Single Page Applications, client-side rendering has long been the go-to method for rendering content. However, as SPAs have evolved, other rendering modes have emerged that offer different advantages and disadvantages. In my talk, we will explore why it's important to go beyond a blank page as the initial response and look at different rendering modes like SSR, SSG, ISG and more. We'll not only cover the pros and cons of each mode but also provide real and comparable examples. By the end of the talk, you'll have a better understanding of the different rendering modes available for SPAs, and be able to choose the best one for your needs. Join my talk to explore the art of rendering modes.

32 min
12 May, 2023

Video Summary and Transcription

Google processes billions of searches per day, but less than 10% of websites get visitors from Google. SEO is user-focused and requires continuous improvement. JavaScript used to be a challenge for search engines, but now they can handle it. Server-side rendering is a solution for the challenges of single-page applications. Good SEO includes HTTPS, mobile friendliness, core web vitals, and handling URL changes. Meta tags and accessibility are important for SEO. Google Search Console provides valuable insights for tracking keyword performance.


1. Introduction to Google and Website Traffic

Short description:

We start very simple. Google processes 8.6 billion searches per day. What is the percentage of websites that actually gets visitors from Google? Not even 10%. A study shows zero visits for a big part of websites.

Hi, everybody. Wow. So, we start very simple. Who of you uses Google? Who of you uses DuckDuckGo? I see a few hands, nice, awesome. But Google approximately processes 8.6 billion searches per day. That means, doing quick math, 100K searches per second. Crazy amount, right? But from that number, what do you think? We all search for various things, Vue.js Live, for example. And we see lots of websites, but what do you think is the percentage of websites that actually gets visitors, like traffic, from Google? Just think of a number between zero and 100, of course. And I want you, right now, to guess it. Just tell me right away, okay? Three, two, one. Interesting. I think someone was very close here. Not even 10%. So there's a study from Ahrefs and it shows clearly, yeah, that big part of the donut here, zero visits. And we'll see how your website will be not in the orange, also not in the red, but best in the green or purple part.

2. Introduction to SEO

Short description:

Welcome to my talk on The Vueniverse of SEO. SEO is user-focused and aims to provide the best content and user experience. It's easy to pick up but hard to master, requiring continuous improvement. SEO is frequently changing, similar to web development ecosystems. Publicly available content, such as marketing pages and personal portfolios, benefits from SEO to ensure visibility on search engines.

So welcome to my talk, The Vueniverse of SEO, Uncovering the Secrets. Yeah, I'm Alex, web dev consultant. I'll just go very quickly over the slides because, obviously, not that much time for introduction. I got a very nice introduction already. I have a Twitter account, I'm also on Mastodon, I have a website, and I'm on GitHub. So I am ready to navigate through the cosmos of SEO.

Nice, I love the energy. But what is SEO even? Well, it could be that it's rocket science, but no, luckily not. And even though it's search engine optimization, it is actually quite user focused. Because it also kind of makes sense: Google and all the other search engines want the best result for the user. So the best content, the best user experience. They want you to find what you're looking for. Also, SEO is quite easy to pick up, but it's hard to master. Like many things, sadly. And it needs continuous improvement. It's not like, okay, I can do my SEO now and I'm done forever. That is not how it works. And it is frequently changing. We as web developers know frequently changing ecosystems, sure. SEO, very similar. Oh, there's a core update from Google. Oh, no, these pages don't rank that well. But don't worry. We won't get into that depth today.

And when do I even need SEO? Well, first of all, your content must be publicly available. Right? So if you have something behind authentication, you don't need SEO, it's fine. But if you have like marketing pages, your company, maybe your own portfolio site, it would be nice if you Google your name that maybe not your Facebook profile or Twitter handle shows up, but maybe your own website, if you have one.

3. Introduction to SEO Galaxies

Short description:

Forums, blogs, articles, and help databases are not relevant for content behind authentication or short-lived content. SEO has three parts: on-page, off-page, and technical. On-page SEO focuses on content, keywords, user experience, and meta tags. Off-page SEO involves link building, social media, brand-building, and authority. Technical SEO covers page speed, performance, security, broken links, and sitemaps. We will focus on on-page and technical SEO using Vue.

That would be great. And of course, forums, help databases, and so on, blogs, articles. And as I said, it's not really relevant for anything behind authentication. And if you have content that's only there for a couple of days, maybe not longer, then it probably won't make sense to optimize it; content that's gone very quickly likely doesn't need it.

So we have kind of three parts of SEO, let's say three SEO galaxies. One of it is on page, on page SEO. So this wonderful purple thing here. This mainly covers content because content is still king. If you write good things on a website, people looking for these things think, okay, you are like an expert in your area, for example. That's very important. SEO can only be optimized if your content is really good. And of course, it's about keywords. So figure out, okay, what are people actually looking for? How is the terminology and what intent is behind there? User experience is also important because obviously if they can't navigate on your site, yeah, well they will leave very quickly. And also setting your meta tags, which we'll take a look at in a bit, and so on and so on. Okay. That's on page.

We have also off page. So off page is this wonderful galaxy here. And there it's about link building. So let other sites, other websites, link to your page, maybe because they like the content. It's like, oh, that's a good reference, and so on. Same for social media, brand-building, citations, so people just mention the name of your project or of your company, and authority in general. You have to show that you're an expert in what you're writing. That's the point here.

And there's technical SEO, the third here. And that's about, we've heard a lot now, page speed, performance, security as well, actually, broken links, sitemap, and so on, so on. So these are just some parts of it. And we will focus on two, on on page and technical, because of time. And well, Vue covers exactly these two things. So let's check that out and see what happens.

4. Understanding JavaScript and Search Engines

Short description:

For classical SPAs, if JavaScript is disabled or breaks, the page will be blank. Search engines used to struggle with JavaScript, but that's no longer the case. However, there are still some caveats. Google won't interact with the page, including buttons and links. It also won't grant permissions for camera or notifications, and it won't scroll. Instead, Google uses a tall mobile-screen emulator to capture content without scrolling.

But first, once again, we have Vue. We have an SPA, right. And an SPA, we all know that, generates the HTML through the JavaScript. Which means, all fine, as long as the JavaScript is there. But what happens if it isn't? JavaScript will not be there if it's disabled or if maybe something breaks. Because, I don't know, you didn't use TypeScript and maybe there's an exception. This will happen. You see nothing. Exactly, a blank page, usually white. Maybe even with a loading spinner. So when there is no JavaScript, there cannot be any content. Right? For a classical SPA, that's the case.

So, yeah, a few years ago, luckily, quite some time, this was even a problem for search engines because they could not understand JavaScript. Google's like, no idea. Luckily, that's not a thing anymore. But there are still some caveats. Right? So, when Google's crawling a website and indexing it, there are a few things that will happen or will not happen. So, for example, there are no interactions. Google won't push any buttons, it won't navigate, even if you have the nicest page transition, they will not click on any links there. They will take the links and then, like, open them separately in a totally new process, so to say. It also means no permissions for, like, camera or notifications. There will also be no scrolling, which is really interesting.

So, what Google's doing: if you open the very classic dev tools and use the device emulator, the tool that says, okay, I will use a mobile screen. Then Google also does that, like, a mobile screen, 400, 500 pixels wide, but the height will just be 9,000 pixels. So they don't have to scroll, they just get all the content they need from these 9,000 pixels, and they don't trigger any scrolling or similar. Okay. Weird.

5. Challenges with SPAs and Solutions

Short description:

No data persistence. Crawl budget affects websites with large files. Late-loaded content may not be indexed by Google. SPAs have difficulty generating preview images. JavaScript errors can break the entire site. Fragment routing doesn't work for search engines. Real URLs are necessary for Google. Client-side rendering may not be sufficient for ranking well. Server-side rendering is the solution.

God, we have that. And there's also no data persistence. So, if you set cookies, if you go to local storage, you say, okay, I'll set some settings, this will be cleared. And there's a so-called crawl budget. If you have lots of files, Google is crawling them, but of course, it should be a fair amount for every website. So they have a budget that's not disclosed, but if you have lots of big data, like big JavaScript files, images, they will crawl the pages slower than others.

And also, of course, there's a timeout thing, so if you load data through the SPA, you have the loading spinner maybe, sure. But at some point, Google might not index content that comes in very late. Yeah, that's for crawling, and SPAs also have some caveats. For example, if you share your favourite link on, let's say, Discord, with your friends, or WhatsApp or wherever, then usually you have a nice preview image and title, description. And for SPAs, it's a bit difficult, because usually the preview images come from the HTML, which is not there by default, because it's generated through JavaScript. So that's always a bit of a problem here: there's no preview or a default preview image, and the experience could be way better.

Also, we just learned no JavaScript, no content. If you have HTML and make a mistake, it's a bit more forgiving, but with JavaScript, it's like, yeah, one mistake could lead the whole site to not work anymore. And especially for Vue, you can't use fragment routing anymore. This applies to any framework. But for Vue Router it's also important: you need real URLs. If you only use a fragment, that won't work for Google. Actually, this fragment part is not even processed in the back end. It won't reach the back end, even. So you need actual URLs, the history mode. Which means it's a little bit more work to set up for multi-page applications, especially if you have your own web server and so on, anyway. And if there's one thing to quote from this talk, it's this. You might say it doesn't matter, it's fine, Google can index my page, it's all good. But being indexed does not mean it will rank well. Just because Google can understand your website and see it, it doesn't mean it's the best page ever. So maybe, in this case, client-side rendering might not be enough. And what else can we do there? Well, you already know the answer. Server-side rendering. Nice.
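The history-mode setup the talk refers to might look like this — a minimal sketch, assuming Vue Router 4; the routes and page files are placeholders:

```typescript
// Sketch: vue-router with history mode instead of hash mode, so every
// route gets a real URL that reaches the back end and can be crawled.
import { createRouter, createWebHistory } from "vue-router";

const router = createRouter({
  // createWebHistory() → real URLs like /blog/my-post
  // createWebHashHistory() → /#/blog/my-post, which never reaches the server
  history: createWebHistory(),
  routes: [
    { path: "/", component: () => import("./pages/Home.vue") },
    { path: "/blog/:slug", component: () => import("./pages/Post.vue") },
  ],
});

export default router;
```

Note that with history mode your web server needs a fallback that serves the app's index.html for unknown paths, otherwise deep links 404 on refresh.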

6. Nux.js Route Rules and Future Features

Short description:

You can do pre-rendering, static site generation, and ISR with Nuxt.js. Rolling your own server rendering is possible but requires a lot of work. Nuxt allows you to define route rules, which will also be configurable on the page component itself. This feature is coming in the future.

You can do pre-rendering, static site generation, ISR. We heard a lot about that from Alba already. We can also nicely mix them up. And you also know which framework can do that. No surprise. Nuxt.js.

You can also roll your own server rendering of course. That's totally possible. But it's tons of work. Trust me. I've seen these setups. I went through them. And it's like, OK, we have our custom setup. How do we migrate? It's tough.

And you can define route rules. Whoever used route rules in Nuxt? One, two, three. OK. So you can say, here, for example, every route that begins with blog and then has any kind of, let's say, slug, article name, should be, for example, static. So say, OK, this will be rendered once, then cached forever. Or stale-while-revalidate. Or we disable SSR right away. And the cool thing is, this is usually configured in your nuxt config. But, little spoiler, soon that's not necessary anymore. You can configure it on the page component itself. You can say, defineRouteRules and say, OK, this page, SSR false. No problem. I don't have a long, long list in the route rules anymore in the nuxt config. But I have this very nicely located in the page component. There's a PR up for that. So this will come in the future.
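In a nuxt.config, the route rules described above might look like this — a sketch; the paths are examples, and option names reflect Nuxt 3 around the time of the talk, so check the current docs:

```typescript
// nuxt.config.ts — mixing rendering modes per route (paths are examples)
export default defineNuxtConfig({
  routeRules: {
    // rendered once on first request, then cached
    "/blog/**": { static: true },
    // stale-while-revalidate: serve from cache, refresh in the background
    "/products/**": { swr: 3600 },
    // plain client-side rendering, no SSR at all
    "/admin/**": { ssr: false },
    // pre-rendered at build time
    "/": { prerender: true },
  },
});
```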

7. Achieving Good SEO with HTTPS and Security Headers

Short description:

To achieve good SEO, enable HTTPS for your website. It's a must-have nowadays and a ranking factor. Also, implement security headers and content security policies to create trust for visitors.

And now, let's see what's needed to achieve good SEO. So I made a little overview. We always have a feature or task or similar. Then we check if it's a technical or on-page topic, and the effort. One rocket means low effort, more rockets mean more effort. Let's just go for it. And we start once again with very simple things. Basic. Basic security. It sounds like, ah, okay. What's this? Well, who of you does use HTTPS in their production environment? Okay. Who does not use HTTPS? Okay, one, two, three. You really should. You really should enable HTTPS. I mean, that's a must have nowadays and it's a ranking factor. So Google will treat, given the same website, HTTPS on or off, better when it's on. And thanks to Let's Encrypt and CloudFront and so on, it's not a big deal anymore. Also, good practice: security headers, content security policies. These are not really related to SEO. But of course, if you're once on it, you should really do it. You also want to create trust for visitors. And as I said, it's low-hanging fruit. HTTPS is a ranking factor.
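In Nuxt, the security headers mentioned above can be attached through route rules — a sketch; the header values here are deliberately strict examples and a real site will need a more permissive CSP:

```typescript
// nuxt.config.ts — security headers for every route (values are examples)
export default defineNuxtConfig({
  routeRules: {
    "/**": {
      headers: {
        // tell browsers to always use HTTPS for this origin
        "Strict-Transport-Security": "max-age=31536000; includeSubDomains",
        // very restrictive example CSP; extend with the sources you use
        "Content-Security-Policy": "default-src 'self'",
        // prevent MIME-type sniffing
        "X-Content-Type-Options": "nosniff",
      },
    },
  },
});
```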

8. Mobile Friendliness and Core Web Vitals

Short description:

Mobile friendliness is a must-have for websites. Google crawls with a mobile screen, so ensure your site is responsive and does not have font or overflow issues. Mobile friendliness is a ranking factor. Use the Mobile-Friendly Test from Google to test your site.

Mobile friendliness. Same idea here. Once again, a must have. But the effort depends a little bit on how your site looks right now. Maybe you don't need to do anything because it's already responsive and nice. But, yeah, Google, actually, as we just discussed with the scrolling part, crawls by default not only with a mobile user agent, but also with a mobile screen. Which means, if you have a font that's too small, or content that overflows and causes horizontal scrolling, that's not that good. And mobile friendliness is actually a ranking factor. There's also a tool that you can use to test it out, the Mobile-Friendly Test from Google. So that's worth a check.

9. Core Web Vitals, Text Compression, Broken Links

Short description:

Core Web Vitals are essential metrics for a healthy site. LCP, FID, and CLS are ranking factors. Text compression is a must-have for faster websites. Brotli is a faster and smaller alternative to Gzip. Check your site with a compression checker. Broken links and redirects can frustrate users, so check regularly and set up redirects for changed URLs.

And then we have the Core Web Vitals, Philippe already mentioned them. They're so-called essential metrics for a healthy site. From Google we have LCP, FID, and CLS here. These are ranking factors. And also, as was mentioned before, FID will be replaced by Interaction to Next Paint (INP) very soon. So also there, gather some metrics and find out if you fulfill these Core Web Vitals or not.

Effort depends a little bit, again, on the state of the site. Really hard to gauge from, yeah, guessing. And then we have a very easy one, once again, which is text compression. Text compression is another must-have nowadays because it's very easy to set up, all browsers support it. And commonly, you use something like Gzip, right? That's the default almost. But nowadays we also have Brotli, for more than seven years now, I think. It's faster, has smaller files as output, so it's always good to choose that if your server or provider supports it. So you can set it up in your, let's say, Apache or NGINX, or if you use a CDN, it's usually on there by default. That's not a problem. And you can then use a checker to, for example, check your site. Also, good text compression means lower transmission size, so a faster website. Nuxt, or in this case Nitro, supports compression for static assets by default; for on-the-fly requests, as I just mentioned, you usually give that over to your web server or platform provider. So if you host on Netlify, it's on by default. Brotli is there. Or if you use Cloudflare, you can just toggle and say, okay, I want to enable Brotli. Good. Once again, low-hanging fruit, go for it.
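The Nitro static-asset compression mentioned above is a one-line switch — a sketch of the relevant nuxt.config option:

```typescript
// nuxt.config.ts — pre-compress static assets at build time
export default defineNuxtConfig({
  nitro: {
    // emits .gz and .br variants next to the static files,
    // so the server can send Brotli/Gzip without compressing on the fly
    compressPublicAssets: true,
  },
});
```

Dynamic, on-the-fly responses are still best compressed by the web server, CDN, or platform provider, as described above.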

And then another very common thing, broken links and redirects. It's like, okay, make sure you don't link to broken pages. It's not a surprise. We all know, if a user clicks on a link that's not there, they will be frustrated and you might lose a visitor on your page. So of course don't do that, but what you should do is check regularly for broken links on your page and also to your page. Let's say, yeah, you got a link from a newspaper, or a friendly dev linked to your portfolio, but maybe the sub-page is not available anymore, because you changed the slug, you changed the URL. What you should do is set up redirects for them.

10. URL Changes, Redirects, and Canonical Links

Short description:

Whenever you change a URL, keep the old one and set up a redirect. Resolve redirect chains and keep them consistent. Set a canonical link for every page to indicate the preferred version of your site. Handle trailing slashes consistently and point search engines to the correct version. Canonical links help avoid duplicate content and improve ranking.

So whenever you change a URL, keep the old one, set up a redirect, very important. Otherwise, you might lose visitors, because if they click on a link and it's broken, they say, okay, then I get my info from somewhere else. If you do that more often, change URLs more often, you might run into redirect chains.

So slash A redirects to B to C. The best is to resolve these chains and then redirect A to C and B to C straight away. But keep these redirects. Just don't change them. And once again, it's possible, for example, to set them up at your web server or platform provider, like Netlify or Vercel, but also you can use, once again, the route rules, or if you have some more complicated ones, server middleware to say, okay, in this case I want to do a lookup in the database and redirect to the best fitting slug and so on. So that's usually something that the server part is covering.
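Simple redirects via route rules might look like this — a sketch; the slugs are invented, and the object form with a status code is a Nitro feature worth verifying against the current docs:

```typescript
// nuxt.config.ts — redirects for changed URLs (slugs are examples)
export default defineNuxtConfig({
  routeRules: {
    // resolve the chain directly: old slug → final slug, permanent redirect
    "/posts/old-slug": { redirect: { to: "/blog/new-slug", statusCode: 301 } },
    // a whole section that moved
    "/posts/**": { redirect: "/blog/**" },
  },
});
```

More complicated cases, like the database lookup mentioned above, belong in server middleware rather than static rules.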

And then we have canonical links. Anybody knows what canonical links are? Okay. A few. Great. Very nice. So, the idea is, you should set a canonical link for every page, and a canonical link is the, let's say, preferred version of your site. And you're like, my version, I mean, there should just be one page per thing I write. Of course, of course, but there are lots of URLs that could point there, also for duplicate content. We'll have a look in a second. It's very important to also handle the trailing slash enforcement here. So let's say we have this tag, link rel canonical, and then we put in the link there. And the idea could be, okay, do I have abc.com or www.abc.com? Do I have a trailing slash at the end or not? That's important, and you should enforce one version. It doesn't really matter if you use www in front or not, or have a trailing slash at the end or not. That's fine. Both is fine. Just be consistent. And point Google and the other search engines there and say, ah, that's the version that you should index. Otherwise, they might index two different versions and they might both rank lower than the actual single version that you could have pointed to. And also, say you're an e-commerce store, you have shoes, and then you have like a special deal.

11. Handling Duplicated Content and URL Structure

Short description:

It is important to handle duplicated content and provide canonical URLs. Ensure consistency in handling HTTP, HTTPS, and WWW versus non-WWW versions. Strip unnecessary query parameters, trailing slashes, and hash parts to create a simple canonical link. Asset optimization, including image optimization and JavaScript performance improvement, is crucial for SEO. Use descriptive filenames for images. Short and descriptive URLs are preferred.

If that is for some reason duplicated content, saying, okay, yeah, I want to show these Nike Air Max in the special section with the same content as the shoes, then also here you should provide a link to the original URL, the canonical URL, if it's exactly the same or duplicated content.

And there are lots of versions. So your web server or platform provider should handle the HTTP versus HTTPS handling, for example. You have HTTP fully redirecting to HTTPS. Same for WWW versus non-WWW handling. Decide for one version and then redirect to it.

And your frontend part should handle the canonical link: okay, let's strip all the query parameters because we might not need them. Let's strip a trailing slash, or leave it there, or add it. Let's remove the hash part. All these are important so that in the end we get this very simple canonical link. So once again, it's fine to have it with WWW in front or no trailing slash. But be consistent. That's the key here.
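In a Nuxt page component, such a canonical link can be derived from the current route — a sketch; the host is an example, and `route.path` already excludes query parameters and the hash:

```typescript
// Inside a page component: canonical link from the current route.
// useRoute and useHead are auto-imported in Nuxt.
const route = useRoute();

useHead({
  link: [
    // one consistent version: www host, no query, no hash
    { rel: "canonical", href: `https://www.abc.com${route.path}` },
  ],
});
```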

Then we have asset optimization. Well, we heard a lot about image optimization in Jakob's talk and general performance today. So very easy, I'll skip through most of that. But it's very important to do that also for SEO because better performance means a faster website, means a happier user, means a happier search engine. Also of course, JavaScript can be improved a lot, performance-wise: tree-shaking, code-splitting, analyzing what you actually need. And the static assets should be cached, of course. Good thing here too, we usually do it already. Vite has that feature. Nuxt is also giving it out of the box. No problem. And please use descriptive filenames for images. Don't name it img1 or img2. Actually give it a name, because search engines can even infer from the name what it could be about.

And then we have the URL structure. Of course we want short and descriptive URLs. Nobody remembers like slash id, slash 1ac, whatever. I don't either.

12. URLs, Meta Tags, and Nuxt

Short description:

We should use descriptive URLs and remove stop words. Trailing slashes and placing keywords in the URL are important. Avoid query parameters and use hyphens as delimiters. Choose between non-www and www. Optimize the URL by shortening it and adding relevant categories. Meta tags, including title and meta description, are crucial for search results. Use the UTF-8 charset and initial viewport. Test and optimize the title and meta description. Utilize OG tags for improved link previews. Set meta tags in Nuxt through useHead.

So that should be clear. We should use descriptive URLs and also we can remove the stop words. That's also fine.

Also here, as just said, trailing slashes, yes or no, it's up to you, but you should enforce it there too. And you should place your keywords in the URL. So if you're writing a blog post about Nuxt.js, then in your URL, the Nuxt part should be there, or if you write a blog post about image optimization, it would be nice that image optimization is in the URL itself. Not only so that the search engine understands it, but also so that the user can just type image in the URL bar, and if they were at your site before, nice, they get it right away.

And query parameters, they should be avoided unless you really need them; it's usually better to just work with slugs there. And as delimiters, you usually should use hyphens. Also once again, non-www vs. www, same idea here. So if you have something like this, like abc.com slash question mark id equals whatever, horrible, please don't. You can improve that by removing the query parameters, but then, yeah, it's still not really descriptive, so let's get it a little better. Okay, a nice blog post about some benefit, not bad, but the delimiters are still off, and maybe, yeah, there's kind of a category to see, right? Like blog and then the post. Still, it's a bit too lengthy, so maybe just shorten it a little bit. And that's a very nice short URL. So if you stick to simplifying it as much as you can, you're on a good track.
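The clean-up steps above can be sketched as a small helper — a hypothetical function, not something from the talk: lowercase, drop punctuation, remove stop words, join with hyphens.

```typescript
// Hypothetical slug helper illustrating the URL rules above.
const STOP_WORDS = new Set(["a", "an", "the", "of", "and", "or", "to", "for", "in"]);

function slugify(title: string): string {
  return title
    .toLowerCase()
    .replace(/[^a-z0-9\s-]/g, "")                        // drop punctuation
    .split(/\s+/)                                        // split on whitespace
    .filter((w) => w.length > 0 && !STOP_WORDS.has(w))   // remove stop words
    .join("-");                                          // hyphens as delimiters
}

// slugify("The Benefits of Image Optimization") → "benefits-image-optimization"
```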

Okay, and then we have meta tags. So if we have a search and a result on a search result page, here, we have a title and a meta description. These are meta tags. So are the UTF-8 charset and the initial viewport. So we should find the ideal title and meta description for each page. Testing is, of course, necessary, but it's really worth it. And you should also use the OG tags to improve the link previews that I just mentioned. And meta tags in Nuxt, they're very easy to set. So for example, these are set by default. Nice. And otherwise you can set everything through useHead. And we will have a look at that right away. So I'm happily switching my tab over here to Hello London. We'll quickly refresh this.
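A useHead call for the tags discussed above might look like this — a sketch inside a page component; all content strings and the image URL are examples:

```typescript
// Inside a page component: title, description, and OG tags via useHead
// (auto-imported in Nuxt).
useHead({
  title: "The Art of Rendering Modes",
  meta: [
    { name: "description", content: "Why your SPA should go beyond a blank page." },
    // OG tags drive the link previews on Discord, WhatsApp, and so on
    { property: "og:title", content: "The Art of Rendering Modes" },
    { property: "og:description", content: "Why your SPA should go beyond a blank page." },
    { property: "og:image", content: "https://example.com/og.png" },
  ],
});
```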

13. Introduction to DevTools and Meta Tags

Short description:

The DevTools now have an open graph tab that shows meta tags, missing tags, and provides code snippets to easily add them. It also provides a preview for Twitter, Facebook, and LinkedIn. This feature was released recently and makes it easier than ever to get your meta tags right for SEO.

And we have the DevTools here. And in the DevTools, this has been released very recently, there is an Open Graph tab here, and it shows you exactly that. It shows you the meta tags, it shows you, for example, the body attributes I mentioned, and it shows also missing tags, saying, okay, there are required tags, there are recommended tags, you should really add them. And there's a code snippet that's just saying, hey, why don't you add that right away and then fill it in. So you can just copy-paste that right away, saying, okay, please, let's just add that, and then the DevTools will be happy. Plus, if you have all the tags, you get a preview for Twitter, Facebook, LinkedIn, for the link preview of the URL. So we have the image there, the title there, the site name. And this has been released a couple of days ago, and it's working pretty nicely in any Nuxt application with the DevTools set up. So it has never been easier to get the meta tags right for your SEO.

14. Tips on HTML, SEO Kit, and Pre-rendering

Short description:

Please use semantic HTML. Use alt text for images. Don't stuff images with keywords. Use Nuxt SEO Kit for sitemaps, robots.txt, and schema.org. Shout out to sponsors. Thanks for your attention. Pre-render.io can be a good service for static sites.

And there's so much more, but not enough time, of course. I can always tell you: please use semantic HTML. If you link to another page, don't use a button, please. Use a link. Use alt text for images, very important. And please don't stuff them with keywords, just describe what the image is about. It will help every user and it will also help the search engine.

Talking about page structure, sitemaps, structured data, robots.txt, hreflang, there's more. But I want to give you one more life hack. Use Nuxt SEO Kit, because it comes with a sitemap module, it comes with a robots.txt module, it comes with schema.org, so easier schema generation, also for SEO structured data. It comes with more experimental features that could improve the performance as well, with dynamic OG image generation and a link checker for broken links. That module is maintained by Harlan, also a team member of Nuxt.js, and he's also maintaining the useHead integration that we saw. So, yeah, from here on, just give it a try, try it out, use the dev tools, use the modules.
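At the time of the talk, the kit was installed as a Nuxt layer — a sketch; the site values are examples, and the exact setup may have changed, so check the module's current docs:

```typescript
// nuxt.config.ts — Nuxt SEO Kit as a layer (setup as of the talk)
export default defineNuxtConfig({
  extends: ["nuxt-seo-kit"],
  runtimeConfig: {
    public: {
      siteUrl: "https://example.com", // used for sitemap and canonical URLs
      siteName: "My Site",
    },
  },
});
```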

And I want to shout out my sponsors, thanks a lot for giving me the opportunity to work on open source. And yeah, thank you for your attention. If you want the slides, feel free to scan the QR code or follow the link. And yeah, that's it for me, thanks a lot. Thanks a lot, Alex. This was really insightful. Already comfortable here. Yeah, quite nice. I'll try that one day. Yeah, I remember a few years ago, I actually started building my own website with React. I didn't even know about Vue back then. And then I noticed when I share links, nothing is visible. So I used pre-render.io. What do you think about it? I think it can be a good service for sure. It will make things easier, but it depends a bit on the kind of data you have. So if you have a static site basically, you can use pre-render.io. But you can also just statically render the site yourself and then you don't need a third-party service.

QnA

AI in SEO and the Changing Paradigm

Short description:

Using AI in SEO can be helpful for people who are not familiar with it. AI can assist in describing content and finding relevant categories. However, AI is unlikely to replace the expertise of SEO specialists. The growth of AI-powered crawlers may reduce the need for manual SEO efforts. Google's focus on generative AI in search may impact the traditional SEO paradigm, especially with the trend towards zero-click search.

It can be a nice option if you have a big SPA and the data is not super near real time so it doesn't change that often. Then it can be pretty nice to have a good migration way to eventually generate some things statically. So a good thing if the data is not too often changing.

Yeah, it was a personal website, it was a quick fix.

Yeah, true, true. Exactly. You know how it is. Of course.

Okay, let's jump into questions. For the first one, a pretty interesting question. With more people using ChatGPT and AI instead of Google, how do you think that will affect SEO? Interesting. I think it can help, especially people who are not into SEO. There are people doing SEO in companies full-time, right? And with AI, I think it can be easier to get into SEO in general and to also figure out ways to describe the content better. I don't think AI will necessarily replace the classic SEO specialist or people's knowledge of SEO, but I think it can be really helpful just to get things right. I also sometimes ask, okay, how do I summarize this and this? Give me a good overview of categories that could fit in there. So just to figure out more ways of describing your content or more categories to put it in, for example. 100%. Crawlers will be powered by AI and then we won't need to take that much care of SEO. True. I mean, I also think there's a tool called AutoGPT, and you can already say, like, yeah, crawl that site, describe the content, and that can be automated already in a certain manner. Yeah. Knowing the AI growth recently, that will happen in like two weeks. True. Awesome. So next question. At Google I/O, Google announced how generative AI will be key when using Google Search. How do you think the SEO paradigm will change so that it's your website's solution, the one returned by Google's AI? Yeah. I think that can be a bit difficult because we already saw the trend for quite a while to go to zero-click search. So one-click search means you search something, you click on the first result, you get the info. That was very common.

SEO and Accessibility

Short description:

Nowadays, zero-click is becoming more common, where Google provides information directly without the need to click on a website. While this benefits users by providing quick information, it can result in less traffic for websites. However, SEO is still important in the social media age because it allows people to find your content through various channels. Social media is just another channel, and crawlers even crawl social media posts. In terms of improving accessibility and SEO, they go hand in hand. Using alt tags for images helps both accessibility and SEO. Monitoring the improvement of SEO and accessibility can be done through various methods.

But nowadays, it's already zero-click: you look for something, and Google already gives you a little preview or snippet, like, hey, this is how it works, or a frequently asked question you can just open — okay, I don't even click on a website anymore. Google gives me the info that it got from your website. And I think with AI, it will be the next step, the next evolution of that, which means less traffic for your website. That is, on the one hand, good, because people get information quicker; on the other hand, you lose traffic, you lose visitors. So that's, of course, a bad thing.

Okay. Thanks for this long answer. Let's jump to the next one. Is SEO still as important in the social media age? I would say yes, because there are lots of ways people can find your content. I mean, just imagine how many Google searches we're doing — or DuckDuckGo searches, or Bing, or whatever you use every day. I definitely think this is very valuable. Social media is just another channel. Ideally, you want traffic from all the channels: because people share the link on social media, because people search for your content, because you share it yourself. So yes, I think it's still very important. Yeah, a hundred percent. Crawlers even crawl social media posts and everything. True, absolutely. So yeah, a hundred percent.

Okay. Let's jump to the next question. What possibilities are there to see how the SEO of our page improves with better accessibility? Do you think it could be taken into account in the future? Can you read the question once again, please? Yeah, it's confusing. What possibilities are there to see how the SEO of our page improves with better accessibility? Okay, I see. So, accessibility — a11y — and SEO go hand in hand, first of all because, as I just mentioned, you should use alt tags. This helps accessibility, and it also helps SEO: a user who might not be able to see the image can still understand it, and search engine crawlers can understand it too. In lots of ways, it's also user first. You should focus on making users happy. But how to monitor how things get better? Well, there's a good way for that.
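As an aside, the alt-tag point above can be turned into a quick audit. Here's a minimal sketch (my own illustrative helper, not from the talk) that scans an HTML string for `<img>` tags missing an `alt` attribute — the kind of check that benefits both screen-reader users and crawlers:

```javascript
// Illustrative helper: find <img> tags with no alt attribute in an HTML string.
// A naive regex scan — fine for a quick audit, not a substitute for a real HTML parser.
function findImagesMissingAlt(html) {
  const imgTags = html.match(/<img\b[^>]*>/gi) ?? [];
  return imgTags.filter((tag) => !/\balt\s*=/i.test(tag));
}

const page =
  '<img src="logo.png"><img src="team.jpg" alt="The team on stage">';
console.log(findImagesMissingAlt(page));
// → [ '<img src="logo.png">' ]
```

Note that an empty `alt=""` (correct for purely decorative images) passes this check, which is usually what you want.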

Importance of Google Search Console

Short description:

Register your site on Google Search Console to track impressions and search results. Monitor changes and analyze the impact of your modifications. It provides valuable insights on keyword performance and user search trends.

So there's Google Search Console; you can register your site there, and you get stats like how many impressions you got — how often you appeared in the search results, for which keywords, which pages, which countries. Then you change something, and usually you have to be patient, because it takes a while until Google picks it up. And then you see: either nothing changed, or you got an X percent increase, or, oh, for this keyword more and more people are searching. So registering your site with Google Search Console and keeping an eye on it is the thing — also a very important part.
