### Transcription

Hey there, I'm super happy to be here with the GitNation team again at this excellent JS Nation event, and I'm even more privileged and honored to be joined by this excellent panel of experts. Evan You, creator of Vue.js and Vite. Is that how you pronounce it, "veet"? Veet. Veet, right. Okay, I wasn't sure about that. It's French. Yes, that's why I asked; I assumed it might be French, but the American in me wasn't 100% sure. Sean, aka swyx, Shawn Wang, head of developer experience at Temporal.io and author of The Coding Career Handbook. Say hello. Hey, hey, hey. And Fred K. Schott, Snowpack author, who apparently had a very exciting launch just yesterday. Yep, just yesterday. Good for you, congratulations. And also hot on the heels of the Open Source Awards, which are near and dear to my heart as the cloud native and open source Israel community lead. So that's exciting. Congrats to all the winners.

I'm really excited to be here; I think this is going to be a really great session. So folks, don't forget to drop your questions for our panelists. They're here with us now, and you really want to hear their expert opinions on next-gen build tools. So I guess the million-dollar question for all of you, and we'll let each of you first introduce yourselves and give us a little background on why you would know a lot about next-gen build tools: what makes a tool a next-gen build tool versus just a regular old build tool? Let's start with Evan.

Sure. So I'm Evan, I work on Vue.js and Vite. And I guess Vite is considered a next-gen build tool; that's kind of the slogan. But honestly, if I were asked to define what really makes it next-gen, I don't think there is a very clear or definitive line. I think existing so-called old-gen tools like Webpack, given proper redesign or improvements, can incorporate some of the characteristics of next-gen tools.
So I don't think it's very definitive, like "this is always going to be old, this is always going to be new." In terms of technology, it's always going to be a shifting landscape. But in general, some of the trends I've personally been seeing are the adoption of these new emerging standards, leveraging more from the platform itself: native ES modules, native dynamic imports. There's some interesting stuff going on in the standards bodies, like `new URL` with `import.meta.url`, you can use that, and there's also ongoing work trying to standardize CSS imports. So a lot of interesting stuff there. I think next-gen tools in general try to play along with these native standards that are coming up and try to leverage as many native capabilities as possible. Another aspect is that next-gen tools don't really limit themselves to strictly JavaScript: as long as it serves the purpose of a build tool, we can leverage lower-level tools written in other languages, or the tool itself can be written in a language other than JavaScript. That's just my two cents.

Okay, that's very helpful. swyx, I'll let you take it from here. Tell us a little bit about yourself and what you consider to be a next-gen build tool.

Yeah, I feel the most imposter syndrome here because I don't actually work on a next-gen build tool. I more or less serve the role of JavaScript journalist: I write about these tools and try them out rather than actually building one. So a lot of respect to the other two who are here. I think the only reason I'm here is because I wrote this blog post about the third age of JavaScript, which I think Fred is also going to talk about. So I had a few definitions. They're ES modules first. They're collapsing layers; we used to have this Unix philosophy in JavaScript where each tool does one little part of the tool chain.
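The native primitives Evan mentions can be shown in a few lines. This is a hedged sketch: the file names are made up, and a literal URL stands in for `import.meta.url` so the snippet also runs outside a browser module context.

```javascript
// Native dynamic import: loads a module on demand and returns a promise.
// In a real module this would be e.g. `const m = await import('./chart.js');`

// `import.meta.url` is the current module's own URL; `new URL(rel, base)`
// resolves an asset relative to it, a pattern build tools can statically
// rewrite. A literal base URL stands in for import.meta.url here.
const moduleUrl = 'https://example.com/src/main.js'; // stand-in for import.meta.url
const assetUrl = new URL('./logo.png', moduleUrl);

console.log(assetUrl.href); // https://example.com/src/logo.png
```

In a real module you would write `new URL('./logo.png', import.meta.url)`, and tools like Vite recognize that exact pattern to include the asset in the build.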
You can see a consolidation of jobs into a single tool, which is partially what Rome Tools does. It's more type-safe. It's more secure, sometimes. It's polyglot, like Evan already said; I have this dream of calling it essentially "scripting shell and systems core," and that's a lot of what people are moving towards too. And then some kind of isomorphic JavaScript, coming in the form of things like Astro, which Fred is also working on. So there are a bunch of ideas all thrown in there. There's no overall theme apart from these being newer and not having been done before. So if you want to draw the next-gen line, I think these are the directions people are exploring.

Interesting. And you mentioned Rome, so we'll keep that in mind, because that's going to be one of the next things we talk about. But I want to let Fred also answer the question. So first, Fred, introduce yourself and also give us your take on what a next-gen build tool is.

Yeah, definitely. I mean, I'm still laughing at Sean's "JavaScript journalist." I've never had a title for what you are, but that's actually spot on. And that blog post kind of nails where we saw our vision. So I've been working on a project called Snowpack for a while now, and before that, something called esinstall. And the launch yesterday was something called Astro. All of this has been about exploring the space of what the next-gen build tool is. When we got started, it was kind of a lot of what's already been said, right? There's this real shift towards using more stuff in the browser, native JavaScript, and relying less on tooling where you can. To give a dramatic summary of where we were coming from over the last decade, it was "Webpack will do everything." And so there's been a real move towards other tools to solve bundling, using more compiled languages and wasm.
You know, I don't think any of these tools would be half as impressive as they are if we weren't able to rest on the shoulders of esbuild and SWC, so a huge shout-out to them; they're powering a lot of this at the lower level. And then, yeah, the browser has caught up over the last decade, and we're now able to rely on these really cool native primitives, which just drops the amount of work you, and your tool, need to do. You get a much simpler baseline of tooling as a result.

That's really interesting. I like the comparison to older tools. Since you touched on it briefly, I'll let you unpack that a little and tell us about the differences: what were you using until now, and what has been significantly improved now that you can adopt next-gen tools?

Yeah, sure. I might be the most "old man yelling at clouds" on this, because we started looking at this so early, when it was just a Webpack-dominated space, and now there are a ton more options. But at the time, this was back in like 2018, 2019, you just couldn't import React in the browser. You basically had to either use a bundler or use the UMD build, which means you're doing `window.React` and taking a more manual approach. It just didn't have the same convenience as going to Webpack, where you're able to use the modern import syntax; it feels really modern and fresh. But behind the scenes, Webpack was taking that and doing a lot of work to bundle it for you, so that it would run in a browser that maybe didn't have that import/export primitive. So a lot of my early days were spent saying, no, you don't need to do all this. But that was only because these primitives were now finally in the browser.
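To make the contrast concrete, here is a minimal sketch of the "UMD build" era Fred describes. The library and function names are invented, only the global-attachment branch of the real UMD pattern is shown, and `globalThis` stands in for the browser's `window`.

```javascript
// Pre-ESM pattern: no import/export in the browser, so a library's UMD
// build registers itself on a global object for other scripts to find.
(function (root, factory) {
  // In a browser, `root` is `window`, so this creates window.MyLib.
  root.MyLib = factory();
})(globalThis, function () {
  return { greet: (name) => `hello from a global, ${name}` };
});

// A consumer can't `import MyLib from 'my-lib'`; it reads the global:
console.log(globalThis.MyLib.greet('web')); // hello from a global, web
```

Real UMD wrappers also detect CommonJS and AMD environments; the point here is just that consumers coordinate through a shared global rather than the module graph the browser now understands natively.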
And so, yeah, Webpack dominated the last decade because it was really the only way to get a modern developer experience that actually ran in all browsers. But now that we have these primitives, some of the basic core understandings of what a dev tool is are being revisited. The idea of having to rebundle your site every time you save, or of running a bunch of processing on large chunks of your site during development: those are being revisited, and a lot of that is going away in favor of what Vite and Snowpack both do, which is treating your individual files as individual builds. Well, Evan, I don't want to speak for you, and I think Snowpack might go a little more dramatically in this direction, but we're treating even the bundler as an optional tool in your tool chain. If you don't need to bundle, you don't have to. You can bring it in when you're ready to tackle performance or go to production. So all of these fundamental things about how we build websites are being revisited for the first time in a while. It's a really exciting time.

Yeah, it sounds exciting. Evan, I'll allow you to chime in here and tell us a little bit about how Vite is different, what Vite is focusing on, and what's on the road map in terms of taking it forward as a next-gen build tool.

Sure. For me, Vite really started with an itch I had myself, because we built Vue CLI, which is webpack-based. And when I built Vue CLI, we had this lesser-known command called `vue serve`, a global command that just allows you to spin up a dev server for a Vue file without any configuration at all. You just need a file, and you can use that command to spin up a dev server. And I liked that, but to use it you have to install the global Vue CLI, and the whole process is still slower than I would have hoped.
I always wanted this thing where I could literally spin up a dev server in milliseconds and have something on the screen immediately. So when I saw browsers starting to land native ES imports, I played around with it, and I was able to actually compile a Vue file on the fly, in the request. I had that proof of concept, but at the time it really was just a proof of concept, because very few people were actually using native ES imports, and I thought, maybe this is just too early. But a year or two later, I realized, oh, all the major browsers today actually ship with native ES imports. Maybe it's time to revisit that idea. At the same time, I was trying to figure out how hot module replacement would work over it, because that's becoming essential for large-scale workflows. So I revisited the project and finally figured out how to do hot module replacement with native ES modules. That was how Vite initially took shape, and I realized this could be more than just a prototyping tool. So at a higher level, Vite is built on top of that. The initial goal was to provide a development experience close to Vue CLI, because we wanted a lighter, leaner, modern equivalent for our users. But along the way, as we moved more and more of the equivalent features into Vite, we realized a lot of them are not Vue-specific; they can be used with other frameworks too. A lot of the stuff is shared, like how you handle CSS or how you do the build. In terms of build, Vite is a bit more opinionated, because we want one tool that handles both the dev server and the build process, right? People are used to Vue CLI doing that, so we want to carry over that characteristic.
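The no-bundle dev model Evan describes can be sketched with a toy transform. This is not Vite's actual code, and the rewrite target path is invented: the idea is only that the dev server serves each file individually, rewriting "bare" import specifiers, which browsers can't resolve, into URL paths on the fly.

```javascript
// Toy specifier rewrite: leave relative/absolute/URL imports alone and
// point bare specifiers (like 'vue') at a path the dev server can serve.
function rewriteBareImports(source) {
  return source.replace(
    /from\s+(['"])([^'"./][^'"]*)\1/g,
    (match, quote, spec) =>
      /^https?:/.test(spec)
        ? match // already a full URL, leave untouched
        : `from ${quote}/node_modules/.deps/${spec}.js${quote}`
  );
}

const input = "import { createApp } from 'vue';";
console.log(rewriteBareImports(input));
// import { createApp } from '/node_modules/.deps/vue.js';
```

A real tool does this with a proper parser rather than a regex, and additionally compiles non-JS files (like `.vue` single-file components) to JavaScript in the same request.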
And eventually, we essentially extracted everything we learned in Vue CLI, combined it with these new ideas, and came up with something that, interestingly, is designed to work a bit more out of the box. Existing users of Vue CLI or Create React App can move over a bit more seamlessly, because it caters to what they're used to, the way certain things are supposed to work; I guess that's becoming sort of a shared convention among the tools. So we do a bit more in that aspect. At a higher level, we really want Vite to be something you can actually ship production sites with today, because we have existing users of the tool we built previously who are already shipping stuff to production. We want our new thing to let them actually move production sites over, instead of saying, this is all just exploratory, don't use it for production. So we spent a lot of time trying to make sure there's feature parity. And in some ways we had to make more pragmatic trade-offs: for example, we really hoped we could bundle with esbuild for production, but there are certain things that are still hard for us to do with esbuild directly, so we eventually opted to use Rollup for production builds. A bunch of the trade-offs we made are really about making sure existing users can still get what they're used to, while still getting some of the benefits of the new development model.

Awesome, really great work. Now, because swyx mentioned Rome before, and we don't have anyone from Rome on the panel, although they were invited, so we wish they were here, I'll let you take a stab at an overview of Rome and how it differs from the tools discussed already. And then we'll move on to more stuff.
So yeah, again, I don't feel qualified to talk about it, but I have talked with Jamie and Sebastian. Rome is not at the same level as Vite or Skypack or Snowpack; it's more at the level of trying to replace a whole bunch of tooling. When they announced their fundraising as a company, they actually published their investor slide deck as well, and I'll share that in the Discord, but essentially they're trying to replace Babel, Webpack, Sass, Storybook, ESLint, Prettier, Stylelint, Gulp, Jest, npm, PostCSS, ESDoc, TypeScript, and so on.

Oh, that's ambitious.

Yeah. So it's kind of the all-in-one tooling. I actually view it as Node's answer to Deno, in the sense that Node has a very defined feature set right now, but it doesn't do a lot of the other stuff we want, like formatting and linting, where we typically have to configure all this. So you would take an existing Node application and slap Rome on top of it. It also does bundling, so that's where it competes, but its scope is a lot bigger, and it has zero dependencies: they want to write all of this from scratch. The other strong opinion they have is that they want to write it all in TypeScript, so there's none of the polyglot tooling thesis in here. It's a very strongly opinionated framework. Right now they only do linting, as far as I know, so the gap between the vision and today's reality is still very stark. But these two people have some of the strongest track records in JavaScript, so it's definitely worth paying attention to.

It's interesting that that's where you wrapped up, because the first question that came in from the community is actually about opinionated versus configurable as a tooling trend. Can you unpack that a little bit? The question was: Webpack and Babel are both super flexible, but esbuild is intentionally not, and Rome is all-in-one.
And you talked about Rome being a little bit opinionated. So, a little bit about more opinionated build tools versus the more configurable and flexible ones. I'll let you continue that, and then we'll move on to Fred.

I think it's a false dichotomy, essentially. We had a brief period where zero-config JS was cool, and then we were like, oh, actually, everyone needs to configure stuff, so we'll try to have good defaults. I think that's what everyone wants.

Yeah, I think so. Awesome. Fred, what are your thoughts?

I was muted there. I feel like I'm cheating: I actually asked that question, trying to test this idea. No, my name is different on Discord, so I promise that wasn't intentional. But I think there's something interesting going on here. I don't know if it's as concrete as a real trend, but there is this idea, which you see in Webpack and Babel, of full customization: a plugin can do anything, you can customize your build entirely, it's this really powerful platform. I'm not seeing that as much in the newer tools. esbuild is intentionally limited to JavaScript: you can maybe compile something like Svelte over to JavaScript, but it's not giving you as much control as Webpack did. And over the years, that power is one of the reasons these platforms are so popular. But at the same time, you still have people having trouble upgrading from Webpack 4 to 5, so long after its release. And Babel has a big ecosystem, but it also has these problems that maybe Rome is trying to solve with its all-in-one stack. So I think there's maybe a constant push and pull in tooling and on the web, where all the power that customization and full control give you ends up actually being a hindrance when taken to the extreme. We see this with Snowpack a lot when people ask what it takes to migrate from an older Webpack app to Snowpack.
The answer is really: it kind of depends how deep into the Webpack hole you've gone. If you've customized every possible import to do things that aren't really JavaScripty, that's not really a Snowpack limitation; it's a limitation of using the browser primitives. You've dug yourself into a hole you need to dig yourself out of. So it's this interesting push-pull between the two.

Interesting. We want to fit a lot more into this panel, so I'm going to ask you, Evan, to keep your answer brief so we can move on to some of the questions coming in from the community. Folks want to ask this powerhouse team some questions. So quickly on this one, and then we'll wrap it up.

Sure. Yeah, my personal experience with Vite is kind of similar. I think it plays into how Webpack started out not only targeting web apps: you can actually use it to bundle a Node.js package and ship it for Node.js. It was designed to cover a much wider range of use cases, and it still does, which just increases the problem surface and its inherent complexity. It has to be that flexible to handle all these different cases. On the other hand, Vite and Snowpack are really focused on the web: we're built with the assumption that you're trying to build a web app. When you narrow the problem space, you can make more assumptions. You can come up with more conventions to make things more streamlined, because you know you're dealing with a specific type of problem. But even within that, you still have to figure out where the sensible defaults fall, and you can only figure that out over time, when people actually build things with your stuff and you see what makes sense and what doesn't.

Yeah, that makes a lot of sense.
The next questions I'm going to target to just one or two of you at a time, because I want to get through more. So, next question from the community: do you think at some point we'll overcome browser limitations, for example in terms of module loading performance with a massive number of modules? I'll ask the second half of the question in a little bit, so let's answer this one quickly. swyx, what do you think?

"At some point" is a very, very long time period. The V8 team has said that I think the limit is something like 100; there's just a physical limit to how many modules you want to load in parallel. So I don't think so, but maybe the other guys can chime in.

Go for it, Evan.

Yeah, I'm not too familiar with how the V8 team plans to solve it, but I think the bottleneck is really in the network. Even locally, with Vite or Snowpack, we do no-bundle dev, right? But we're still loading all these modules over HTTP requests, and there's still an overhead. When the number of parallel requests is large enough, even with zero latency, you're still seeing a lot of overhead. So it's an intrinsically difficult problem when you're trying to load this many modules over the network, and I'm not sure it's solvable in the short term. In the long term, it may require some fundamental innovation at the protocol level to see this change.

What an interesting segue, because that's the second half of the question; I'm not sure if you already answered it or not. The other half was: we have HTTP/2, a bit of a tongue twister, and ESM, but at the same time we still tend to be better off bundling one package, sometimes with code splitting. What, in your opinion, is missing in the browser and the ECMAScript standards to allow working with modules out of the box, without the performance drawbacks we see right now?
So I feel like you kind of touched on that, so I'll let Fred expand on it a little bit.

Yeah, I mean, everything is constantly getting a bit better. Now we have HTTP/3, which is even better at multiplexing resources: actually separating multiple files being loaded into almost separate streams within the connection, so that one dropped packet doesn't tank the whole connection. There's a lot going on at the networking layer that, not really explicitly but kind of unintentionally, has a nice impact on unbundled loading performance. Now, at the end of the day, 1,000 files versus one file is still going to be slower on that first load, so it starts to become more of a game of trade-offs. If you're building something that's more of an application, like your inbox on Gmail, something you have constantly open in a tab and keep coming back to, there's this really interesting performance idea where you're not really optimizing for that first page load anymore; you're optimizing for the repeat visitor. And then the caching story of a less-bundled site becomes an interesting part of this: looking at the extreme of every file individually served to the user, what that means is that when one file changes, you have the opportunity to ship only that one change to the browser.
So instead of resending an entire bundle, where any time any engineer at Google pushes a change you end up rebundling the entire site and reshipping it to every user, because it blows up the cache of the single bundle, you get something much more flexible. There's a ton of complexity I'm hand-waving away there, fingerprinting and caching strategies, but it becomes less of a "for every use case you must do this one thing" and more of a trade-off based on your use case and your performance goals. And again, the networking stack keeps improving, with browsers in this evergreen model where they improve at a much faster clip than they did 10 or 20 years ago.

That's super interesting. I actually do want to get to the use cases, but I see there are more questions from the community, so I'll ask those, and I think we'll wrap up with the question about the ideal use cases for each tool. So, the next question from the community: Vite is the new kid on the block for me; how does it compare to Snowpack? All right, I'll let swyx take that.