Okay, let's go back to my doc. So we can either store the data in the session, store it in the database, or use local storage in the browser. Storing it in the session is the simplest: as the user navigates and submits each form, that page's data is added to the current session, keyed by page number. When the final page is submitted, you should have all the data needed to persist to your database. As we saw when we got to page four, all that data was there. That's the point where you validate that all the required fields were filled in, and if not, link back to the relevant page so the user can fill them in. The pros are obvious: it's simple to use, and the data persists as long as the session exists. As you saw before, I filled out those fields last night, came back, and because the session was stored in my cookie, all those fields were still there. The cons: as we discussed with cookie storage, the session can get pretty big depending on the data; if the session expires before the user finalizes, they lose that data; and they can't continue from a different browser.
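The per-page session approach described above can be sketched like this. This is a minimal illustration, not code from the actual project; names like `WizardData`, `savePage`, and `validateFinal` are hypothetical, and in Remix the `WizardData` object would live in a cookie session via something like `createCookieSessionStorage`.

```typescript
// Accumulated wizard state: one bag of fields per page, keyed by page number.
type WizardData = Record<string, Record<string, string>>;

// Merge one page's submitted fields into the accumulated data.
// In Remix this would run in the page's action, reading/writing the session.
function savePage(
  data: WizardData,
  page: number,
  fields: Record<string, string>
): WizardData {
  return { ...data, [`page${page}`]: { ...data[`page${page}`], ...fields } };
}

// On the final submit: flatten all pages and check the required fields.
// If anything is missing, the UI can link back to the page that owns it.
function validateFinal(
  data: WizardData,
  required: string[]
): { ok: boolean; missing: string[] } {
  const flat: Record<string, string> = Object.assign(
    {},
    ...Object.values(data)
  );
  const missing = required.filter((f) => !flat[f]);
  return { ok: missing.length === 0, missing };
}
```

Only when `validateFinal` passes would you persist the flattened data to the real database models.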
The second option is to store the data in the database. One nice thing about that is durability: you don't have to worry about the data going away if the session expires. And because it's a centralized store, the user can come back and pick up where they left off regardless of which device they're on. One potential issue is that, depending on your data model, you may run into trouble with required fields and relations. If your model has required fields and you try to save partial data, the model will reject it: that field can't be null, it has to have a value. So you either make all those fields optional, which is obviously not ideal, or you create a separate data model just to hold this transitional data. It can get a little tricky. A simpler approach is to store the data, session-style, in a key-value store: the key identifies the user and form, and the value is a blob of JSON. Then it's not until the final page that you save it into the actual data models. That also helps with relations. Sometimes you have real dependencies, like having to create a project before you can add team members to it. If the project hasn't been created yet and the user is on a page entering team members, you can't save those members because there's no project ID to reference. So a lot of times you save all the data in a key-value store and create the final models later.
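Here is a small sketch of that key-value "draft" pattern, including the create-the-project-first ordering at finalize time. Everything here is hypothetical (table shape, `createProject`, `addMember`, the `"new-project"` form key); in a real app the `Map`s would be database tables.

```typescript
// Draft store: `${userId}:${formKey}` -> JSON blob of partial form state.
const drafts = new Map<string, string>();

function saveDraft(userId: string, formKey: string, data: unknown): void {
  drafts.set(`${userId}:${formKey}`, JSON.stringify(data));
}

function loadDraft<T>(userId: string, formKey: string): T | null {
  const blob = drafts.get(`${userId}:${formKey}`);
  return blob ? (JSON.parse(blob) as T) : null;
}

// In-memory stand-ins for the real models (hypothetical).
const projects = new Map<string, { name: string; members: string[] }>();
let nextId = 1;
function createProject(name: string): string {
  const id = `proj_${nextId++}`;
  projects.set(id, { name, members: [] });
  return id;
}
function addMember(projectId: string, member: string): void {
  projects.get(projectId)!.members.push(member);
}

interface Draft {
  projectName: string;
  members: string[];
}

// Final step: create the project FIRST so the team members have a
// project ID to reference, then clear the draft.
function finalize(userId: string): string {
  const draft = loadDraft<Draft>(userId, "new-project");
  if (!draft) throw new Error("no draft to finalize");
  const projectId = createProject(draft.projectName);
  draft.members.forEach((m) => addMember(projectId, m));
  drafts.delete(`${userId}:new-project`);
  return projectId;
}
```

The draft table never has to satisfy the real models' constraints, because it only ever stores an opaque blob.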
The final option is storing the data locally in the browser. You can use localStorage or IndexedDB, but I don't recommend that as a primary option. It only makes sense for very specific use cases, such as needing to support the user being offline. The issue is that once the server is no longer directly involved in storing the data, you're back to the old method of maintaining local state, which adds a lot of additional code. So I would only use it in specific cases. In fact, in the example I'll show, I have an option to record video, and I don't want the user to record ten minutes of video and then find out when they try to save it that something's broken. So as they're recording, each video blob gets stored in IndexedDB, and when they stop the recording, I bundle up all those blobs and send them to the server. Ultimately, you want to do what's best for the user and enhance their experience, and losing data is probably one of the worst things a user can experience, so anything you can do to mitigate that is always helpful.
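The buffer-then-bundle control flow can be sketched like this. In the browser, the chunks would come from MediaRecorder's `dataavailable` events and the store would be IndexedDB so a crash mid-recording doesn't lose footage; here an in-memory array stands in for the IndexedDB object store so the logic is clear. `ChunkStore` is an illustrative name, not from the project.

```typescript
// Durable buffer for recorded media chunks.
// Real version: each save() is an IndexedDB put(), so partially recorded
// video survives a tab crash or a failed upload.
class ChunkStore {
  private chunks: Uint8Array[] = [];

  // Called for each chunk as it arrives during recording.
  save(chunk: Uint8Array): void {
    this.chunks.push(chunk);
  }

  // Called when recording stops: concatenate every stored chunk into one
  // payload, ready to upload to the server in a single request.
  bundle(): Uint8Array {
    const total = this.chunks.reduce((n, c) => n + c.length, 0);
    const out = new Uint8Array(total);
    let offset = 0;
    for (const c of this.chunks) {
      out.set(c, offset);
      offset += c.length;
    }
    return out;
  }
}
```

The key point is that nothing is uploaded until the user stops recording, but nothing is held only in volatile memory either.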
All right, let me actually show you. This is the actual project I'm working on. I'm not sure how well you can see this, but this is using Remix Flat Routes. This is my routing structure, so I can see at a glance what all my routes look like, and this is the one with the message form. I actually split my route into two files: the main route file and a route.server file. All my server-only code lives in route.server. The nice thing about this is that when I'm using Zod schemas, because they're all on the server, I don't have to worry about the Zod package, which is almost 20 KB, ending up in the client bundle. By only using the Zod schema in server code, it never gets bundled for the client. Let me see if I can... let me kill port 3000. And don't look at that, it's not actually live yet. Okay, here it is. Here are my collapsible forms. Something isn't working, figures, and I don't want to go to the test one. Okay, let me zoom in on this one, where I can drag pictures. I can drag and drop, though of course it's not working here right now, and it uses Cloudflare Images. I have an API call that goes out and fetches a private, secure URL to upload directly to. I fetch that from my Remix server, and when it comes back down with the URL, I upload directly to Cloudflare from the browser. I don't want the browser to upload to my server and then have my server turn around and upload to Cloudflare; but because of the API keys involved, I also can't upload to Cloudflare directly without that step.
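That two-hop upload flow can be sketched as follows: the browser first asks our own server for a one-time direct-upload URL (only the server holds the Cloudflare API token), then uploads the file straight to Cloudflare. The endpoint path `/api/images/direct-upload` and the response shape are illustrative assumptions, not the project's actual API; the fetch function is injectable so the flow can be tested without a network.

```typescript
// Minimal fetch-like interface so the flow can be exercised with a fake.
type Fetch = (
  url: string,
  init?: { method?: string; body?: FormData | null }
) => Promise<{ json(): Promise<any> }>;

async function uploadImage(file: Blob, fetchFn: Fetch): Promise<string> {
  // 1. Ask our own server for a one-time upload URL. Server-side, this is
  //    where the Cloudflare API token is used, so it never reaches the client.
  const res = await fetchFn("/api/images/direct-upload", { method: "POST" });
  const { uploadURL, id } = await res.json();

  // 2. Upload the file from the browser straight to Cloudflare,
  //    bypassing our server entirely for the heavy payload.
  const form = new FormData();
  form.append("file", file);
  await fetchFn(uploadURL, { method: "POST", body: form });

  return id; // store this id; the image is served from it later
}
```

The payload never transits our own server, which saves bandwidth and keeps the API keys server-side.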
So there's an indirection where I first request, from their API, a URL that's specific to that upload. Yes, signed URLs, exactly, thank you. I do that for the photos; I'm not sure why it's broken right now. I also have the ability to record audio and to record video. Video is the case I was talking about: it stores the data in IndexedDB while recording and then uploads. It does the same thing, gets a signed URL and uploads to Cloudflare Stream, which is pretty cool. Cloudflare Images, if you haven't used it, is kind of like Cloudinary and other third-party services: it regenerates the image based on whatever your user's device is. You can specify things like cropping and widths in the URL, and it will determine what formats the device can handle and automatically generate images for that. Same with video: you upload the raw video and it does all the conversions to the different formats, and depending on the device's capabilities, it serves either a high bit rate version or a lower bit rate version. I'm not trying to sell anybody on Cloudflare, but they do have some great services and they're relatively easy to use.
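To my understanding, the URL-based transforms mentioned above work through Cloudflare Images delivery URLs, where options like width and fit can be appended as a comma-separated segment (the "flexible variants" feature). A tiny helper to build such a URL might look like this; the account hash and image id are placeholders, and the exact option set supported is Cloudflare's, not defined here.

```typescript
// Build a Cloudflare Images delivery URL with inline transform options,
// e.g. https://imagedelivery.net/<account_hash>/<image_id>/w=400,fit=cover
function imageVariantURL(
  accountHash: string,
  imageId: string,
  opts: { w?: number; h?: number; fit?: string }
): string {
  const parts = Object.entries(opts)
    .filter(([, v]) => v !== undefined)
    .map(([k, v]) => `${k}=${v}`)
    .join(",");
  return `https://imagedelivery.net/${accountHash}/${imageId}/${parts}`;
}
```

So the same uploaded original can be served cropped for a thumbnail and full-width for a hero image, just by varying the URL.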