Anyone who builds an SPA and breaks opening in a new tab, history caching, or back/forward nav isn’t a good frontend developer (or lacks experience, which is something that’s fixable!). These have been solved problems for a long time.
I mean, for sure, and this meme isn’t trying to say that all SPAs are bad. But defaults matter, even for experts.
This meme was inspired by an SPA I had to use which, among the points in the meme, also broke using Alt+Left to navigate back. The normal back button worked (even if it then had to load for ten seconds to re-display static content).
Which is just a typical example to me. You don’t even need much expertise to figure out why Alt+Left is broken. But you have to think of testing Alt+Left, because it’s broken by default.
My friend, I’ve been using the Internet for 27 years and developing for it for most of that time, and I can promise you I’ve never once hit Alt+Left
As your younger and more modern replacement, I use it regularly
You don’t sound like ChatGPT
I’m sorry, but as an AI language model, I cannot contradict my instructions to remain hidden while commenting.
I have never heard of alt+left, and I’ve been using the Internet since Mosaic was all the rage. Shame on me, it seems to be implemented in all browsers. How could I have missed it?
It’s even implemented in many file managers and text editors and such. Pretty much the standard shortcut for navigating history. But yeah, hilariously it’s somehow also a rather well-kept secret.
Yeah, I have no trouble believing that. It took quite a while before I learned of this shortcut and when I did, I was wondering why I would ever want to use it.
But I generally work from my laptop these days, without an external mouse connected, so reaching from my touchpad, the Left key is right there.
Your reason for using it was exactly my question. “I have a mouse with a built in back button, why would I want to remove my hand from my mouse and navigate with the arrow key?”
But your reason simply makes sense.
You can probably go back by swiping two fingers to the right on the touchpad. Maybe it depends on the OS and browser.
Yeah, that works on my personal laptop, but not yet on my work laptop, because they insist on preinstalling an old, buggy OS. If that did work everywhere, I would probably be using that, but not breaking Alt+Left for whoever needs/wants it would still be nice. 🫠
Ok, that’s unfortunate. But I agree, the browser’s default keybindings really shouldn’t be broken; it’s really annoying. I hate it when middle click doesn’t work on some web pages. 😒
It really sucks when they break “open link in new tab”. I then have to follow the stupid link, then middle-click the back button to do what they broke.
I started using alt+left when browsers started removing backspace. It was for the best.
Just here representing the Cmd+[ gang
Ctrl+[ here
I’m guessing they aren’t using Vue, React, or similar, and they’re rolling their own for some reason.
React doesn’t handle any of this stuff out-of-the-box; it’s just a UI library.
Neither does Vue. You need `vue-router`, which is required anyway to make an SPA with multiple pages. The only thing that breaks is that component state isn’t saved, but this can be fixed by rendering `<RouterView>` with `<KeepAlive>`. How to do this is mentioned in the documentation. I assume it’s similar with React and `react-router-dom`.

It’s one install line ffs, how is this a conversation in 2024? It’s EASY.
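For reference, the documented pattern looks roughly like this in whichever component renders the `<RouterView>`:

```html
<!-- Keep routed components (and their state) alive when navigating away. -->
<RouterView v-slot="{ Component }">
  <KeepAlive>
    <component :is="Component" />
  </KeepAlive>
</RouterView>
```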
Conversely, a lot of static websites break new tab by incorrectly slapping `target="_blank"` on anchors. Luckily Lemmy doesn’t mess this up.

I maintain a couple of WordPress installations for clients, where the default link target is the same page, as you’d expect.
They still, somehow, manually check “link opens in new tab”. I don’t know why some of these boomers are allowed to use computers, I swear.
If you manage the WordPress installation, can’t you disable the ability or create/install a plugin that removes that ability? This hurts usability.
I could, good point. I do disable plugins for clients so they can’t beat up their own website too much.
Still, there are legitimate uses for opening a site in a new tab; e.g. when it’s an external website. I don’t think I should automate that, since there’s a granularity in there.
legitimate uses for opening a site in a new tab; e.g. when it’s an external website
This is not a legitimate use—this breaks the default user agent behavior & completely removes the autonomy of opening in the current window (there are tons of ways to open in a new tab/window). Consider rechecking the article linked higher up the thread tree.
If your SPA website is done correctly the end user won’t even notice and none of the bad things listed in this meme happen.
I FUCKING LOVE STATIC HTML PAGES
I LOVE NOT HAVING TO RELY ON SCRIPTS TO DISPLAY CONTENT
Welcome to our homepage! We have implemented the navigation menu in Adobe Flash Player to maximize your audio visual experience.
that’s some PTSD comment right there, I’m getting flashbacks.
Ayyyy
These things are true if you build a SPA wrong. Believe it or not there are lots of ways to build server side rendered pages wrong too.
Yeah this meme and the OP have no idea how to build an SPA.
I don’t know what the hell you’re reading into this 15-word meme, but I do. I’m not saying all SPAs are shit, I’m saying far too many are, with “far too many” being more than one that I can think of. Even the Lemmy webpage breaks history caching.
I know what an SPA is, but I would be laughing so hard at this thread if I didn’t know what it meant.
“Yeah man. Dude doesn’t know his SPAs!”
Reminds me of that Saturday Night Live skit with the woodworkers comparing everything to working on the lathe.
> implying there’s a “right way” to build an SPA.
There are a lot of standard practices like… using a router to load the content of your SPA according to the url.
What I’m saying is, there’s no right way to build a thing that is inherently wrong.
You could build it with no input sanitation. That’s wrong.
Even a perfectly-built SPA is a thing that should’ve been a different kind of program (a native app or even something like Java Web Start) instead.
I strongly disagree, but I respect your opinion which was no doubt formed by different experiences with web technologies than I’ve had.
Not that it’s inherently good or bad, but the heavier web apps get the more a browser represents a sort of virtualization environment that only runs one stack. I think that’s interesting.
There’s no one right way. Saying there are wrong ways doesn’t imply the existence of one right way, though.
As an elder developer… yea, we could use react to render complex web pages that erode expected functionality.
Or, like, I’m happy to just go back to server-side rendering… it’s surprisingly cheap to build and dead fucking simple.
Elder developer here too; correctly making my SPAs has made my work significantly more efficient and maintainable, now that my back end is basically a REST API and my front end requires very little network interaction after the initial load, which has been made pretty minimal.
I too have been doing this for years and I wholeheartedly agree with this comment.
For large complex sites, I ain’t never going back.
Actually even for simple sites I’m not sure I’ll go back.
If I ever have to do this again, I’ll scream.
<a href="<?php echo '/about-us'; ?>">
Elder developer too, you can easily render React server-side and statically. Once you remove state, React simply becomes pure functions that output JSX nodes; it’s also dead fucking simple, but gives you the possibility to add hydration and state later if you need it.
This is actually excellent advice for performance - you can bake the initial page data in!
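A tiny sketch of that with `react-dom/server` (the `About` component and its data are made up):

```jsx
import { renderToStaticMarkup } from 'react-dom/server';

// Stateless: just a pure function from props to JSX nodes.
function About({ team }) {
  return (
    <main>
      <h1>About us</h1>
      <ul>{team.map((name) => <li key={name}>{name}</li>)}</ul>
    </main>
  );
}

// Run at build time: the initial data gets baked straight into plain HTML.
const html = renderToStaticMarkup(<About team={['Ada', 'Grace']} />);
```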
I prefer just writing my html, js, css, as is, and then transpiling to pack it down, treeshake, hash, cache bust, CSP, etc etc.
The amount of headache, overhead, inversion of control, mess, and bloat involved in frameworks tends to make me spend way too much time on writing boilerplate.
`<template>` and `<slot>` exist now, and modern JS can do most of the shit fancy libs used to. There’s very little need for frameworks unless you need a SUPER dynamic website that has tonnes of mutability.
The amount of times I see people load in like 3 frameworks and 10MB of bullshit and ten JS files to make a fucking static form that doesn’t even do anything fancy is insane.
Just fucking write the like… 8 lines of normal code to populate the form, wtf? Why are we using routers at all, HTTP already exists and does that, why did we re-invent http?
Front-end devs need to spend less time installing npm packages to try and magically solve their issues and just learn how to actually write code, SMH.
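Case in point, a handful of lines with the `<template>` element mentioned above covers the classic “populate a list” case; the markup here is just an illustration:

```html
<template id="row-tpl">
  <li><span class="name"></span></li>
</template>
<ul id="list"></ul>

<script>
  // Clone the template once per item: no framework, no build step.
  const tpl = document.getElementById('row-tpl');
  const list = document.getElementById('list');
  for (const name of ['Alice', 'Bob']) {
    const row = tpl.content.cloneNode(true);
    row.querySelector('.name').textContent = name;
    list.appendChild(row);
  }
</script>
```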
More Server side logic means more vulnerabilities on your end.
I’ve seen front ends that build queries that are blindly executed by the backend - I’ve seen GraphQL that allows the client to read arbitrary users’ passwords from the database - I’ve seen attack ships on fire off the shoulder of ori- whoops, wrong memory.
Anyways, you can create vulnerabilities anywhere using anything - imo more server side logic might mean more vulnerabilities on the server but it means less vulnerabilities overall.
Why does it mean less vulnerabilities overall?
You think your 13 megabyte parallax-ative home page
Is parallax still a thing? I feel like ginormous hero images are more popular atm.
motherfuckingwebsite is pretty old at this point. I remember seeing it on Reddit like 10 years ago. Parallax was all the rage back then, when we called “hero” images “jumbotrons” (because Bootstrap called it that, I think?)
Which was derived from those big-ass screens in sports stadiums and arenas, as it was overshadowing the actual stuff below.
I was kind of chief architect for a project where I worked. I decided on (and got my team on board with the idea of) making it an SPA. Open-in-new-tab worked perfectly.
(One really nice thing about it was that we just made the backend a RESTful API that would be usable by both the JS front-end and any automated processes that needed to communicate with it. We developed a two-pronged permissions system that supported human-using-browser-logs-in-on-login-page-and-gets-cookie-with-session-id authentication and shared-secret-hashing-strategy authentication. We had role-based permissions on all the endpoints. And most of the API endpoints were used by both the JS front-end and other clients. Pretty nice.)
I quit that job and went somewhere else. And then 5 years later I reapplied and came back to basically the exact same position in charge of the same application. And when I came back, open-in-new-tab was broken. A couple of years later, it’s not fixed yet, but Imma start pushing harder for getting it fixed.
What some folks are missing is that SPAs are great for web applications & unsuitable for web pages. There is more nuance than “SPA bad”.
When dealing with a lot of dynamic content, piping thru a virtual DOM DSL is 100× nicer for a developer than having to manually manipulate the DOM or hand-write XML where it’s easy to forget all the closing tags (XML is better as an interchange format IMO & amazing when you need extensibility… also JSX just makes it worse). That developer experience (DX) can often lead to faster iteration & fewer bugs, even with a cost to the user experience (UX). But it’s not always a negative impact to the UX: SPAs can be used to keep things like a video or music player running while still browsing, & using the URL bar as a state reference to easily send links to others or remember your own state.
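A bare-bones sketch of that URL-as-state-reference idea with the History API (`renderView` is a made-up stand-in for whatever swaps the visible content):

```js
// Swap views client-side so e.g. an <audio> player keeps playing,
// while the URL still reflects where you are and stays shareable.
function navigate(path) {
  history.pushState({ path }, '', path);
  renderView(path); // hypothetical: re-renders the main content area
}

// Keep the browser's back/forward buttons working.
window.addEventListener('popstate', (event) => {
  renderView(event.state?.path ?? location.pathname);
});
```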
It’s equally silly that a landing page whose primary purpose is to inform users of content takes 40s to load & shows “This application requires JavaScript” to TUI browser users & to web crawlers/search indexers that don’t have the scale of Google to be executing JavaScript in a headless browser just to see what a site has to say.
The trick is knowing how & when to draw these lines as there’s even a spectrum within the two extremes for progressive enhancement. React isn’t the solution to everything. Neither is static sites. Nor HTMX. Nor LiveView. Nor Next/Nuxt/Náxt/Nüxt/Nœxt/Nอxt.
I don’t agree with this hard split between SPAs and MPAs anymore (ie. SPAs for apps, MPAs for websites/content). In my opinion SPAs are simply a progressive enhancement for MPAs which allow even faster page navigation. All frameworks now come with SSR solutions and if a website still requires JS to show content that’s a skill issue.
Looking at Astro, the line between SPA/MPA is getting really blurry. Just slap a View Transition element on your page and you’ve got an MPA which acts like an SPA when JS is enabled.
In my opinion SPAs are simply a progressive enhancement for MPAs which allow even faster page navigation.
While I agree that there is a spectrum (hinting at that with the last paragraph), this is where I hard disagree. To construct something like this, you are making an application massively complex by trying to re-implement everything on both ends. Using something like Astro only hides that complexity; it’s still there, & probably full of bugs & tons of JavaScript, & most developers wouldn’t even understand their stack or know how to jump into the Astro code. The amount of time saved is largely minuscule in most cases, with the assets cached when navigating to a new page. In fact, I just tested two of their showcased sites, which loaded slower with JavaScript enabled, & the content was pretty obviously 95% static. There are probably some niche use cases for this, but it’s not a good default IMO.
What is a web page vs web application? The web is so complex with features these days that pretty much everything is an application.
Gmail is a (bad) web application. A marketing website or even an ecommerce store are not.
I admitted it was a spectrum, but this recent article in particular does a good job explaining the axes of static vs. dynamic and online vs. offline. I think you will appreciate it. :)
It’s fascinating how some SPAs come about. Often it’s consultancies who win some bid to implement X features. Since “good user experience” is hard to quantify/specify, the end result is horrible.
Zalaris is one such that I’m in complete awe of. Set up user flows that are expected to take 30 minutes to complete. Yet, don’t keep track of that state/progress within your own SPA. Click the wrong tab within that SPA, and state is reset.
It’s just fascinating.
Building “applications” out of HTML documents – a single one or otherwise – is the sort of thing that belongs in one of those “stop doing X” memes, unironically.
Why? I like that the web platform gives more freedom to users.
No. Users should be forced to install hundreds of apps, with two thirds of apps running simultaneously. And if they don’t have memory left on the device for that, they should uninstall apps and reinstall them when necessary.
/s
Remember when websites had a built-in loading indicator, and you didn’t have to implement it yourself via JavaScript?
I remember when the web didn’t have JavaScript.
Honestly though, it was much worse back then. I prefer the variety and features of modern browsing over (mostly) plain text.
What I wish we could do away with on the web is videos. Let’s go back to just images and text, thanks. Animations are fine though 👍
What I wish we could do away with on the web is videos. Let’s go back to just images and text
Seconded. I really enjoyed pre-video Internet.
I third that. Videos are so incredibly inaccessible. Want an easy-to-follow tutorial or, heck, a searchable document? Nah mate, video is all you get, and ads with it!
The ads make it even worse. I totally get the Gemini folks for wanting to simplify. But I do enjoy me a good webapp.
Okay, Gemini seems a bit hardline. Not even Tables or Images?
Well, you can link to images and individual clients can then choose to directly embed those images inline, where the link is placed.
I kind of get it.
- No images is because they want it to work in a plaintext environment.
- No tables because you just know someone is going to use it to format stuff that isn’t tabular data, though I guess there isn’t a way to actually render tabular data either…
Mhm, I disagree with your second point. Since you can’t use any styling on Gemini objects, you won’t get table layouts as we had in the dark ages of HTML. With tables like in Markdown you can just lay out tabular data in an actual table.
Mhm, I guess with the plaintext environment we can still link to external resources like images and other multimedia or interactives.
You do get searchable auto-transcripts of videos now, so that’s a good thing. Some people work better with videos and find them more accessible. Best of both worlds. As long as they are not auto-playing and pre-caching, I’m fine with them existing.
HTMX is great and is the only frontend development tool I don’t absolutely loathe. It enables lightweight SPA development, and provides a very simple and efficient mechanism for doing HTML over the wire.
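For the unfamiliar, it’s all driven by attributes; something like this (the `/fragments/news` endpoint is made up) asks the server for a rendered HTML fragment and swaps it into the page:

```html
<!-- hx-get issues the request; hx-target/hx-swap say where the returned HTML goes. -->
<button hx-get="/fragments/news" hx-target="#news" hx-swap="innerHTML">
  Load news
</button>
<div id="news"></div>
```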
Not sure I would call HTMX an SPA framework though? Like, it allows easy async content fetching for sure, but it’s usually done across an MPA?
Where did I call it a framework?
Sure, wrong term. I think my point still stands though. An SPA is *generally* DOM elements “rehydrated” from JSON data pulled from an API, whereas HTMX is more akin to classic AJAX-style page dynamism.
An SPA is *generally* DOM elements “rehydrated” from JSON data pulled from an API, whereas HTMX is more akin to classic AJAX-style page dynamism.
You’ll forgive me if I say this is an instance of splitting hairs and having a particular definition for something that includes extra qualities separate from what those terms are actually describing for most people. Also, things like, I dunno, React, are going to extensively use ajax to accomplish what they do. It’s literally just asynchronous javascript. It’s like someone saying “my vehicle of choice is a motorcycle” and then someone else saying “A motorcycle isn’t really a vehicle. It’s a transportation device with wheels. A car is a vehicle.” They are both vehicles. They both have wheels. The wheels are ajax. A page made with htmx and a page made with React are both SPAs.
My point is just that I think, to most people, HTMX doesn’t create what is classically considered to be an SPA any more than older MPAs that utilized AJAX did. There’s a reason those old-style pages aren’t really considered SPAs. Is it splitting hairs? Absolutely.
I think we’re gonna have to agree to disagree on definitions. To me, and I believe, to most people, an SPA refers to a UI/UX design pattern that can be implemented with any number of underlying techniques. I would also say that the Wikipedia page for SPAs (on the assumption that wikipedia is a valid tool for establishing consensus for definitions) supports my definition:
A single-page application (SPA) is a web application or website that interacts with the user by dynamically rewriting the current web page with new data from the web server, instead of the default method of a web browser loading entire new pages.
There are various techniques available that enable the browser to retain a single page even when the application requires server communication.
And it goes on to list frameworks, AJAX, Websockets, etc.
Unfortunately it also kicks Content Security Policy square in the nuts and shoots a giant hole right through your website security, so if anyone on my team brings up using it I inform them it’s an instant security fail if we so much as touch it.
It’s a cute idea but horribly implemented. If your website has any security requirements, do not use htmx
Edit: the fact so many people have no idea about this and are downvoting is sad. People need to learn how CSP headers work, and why HTMX, as it is currently designed, inherently completely bypasses them.
Can you elaborate on that? I haven’t used it, but I’d just assume that if you host it on your own domain you can have it play nicely with CSP; there are docs on their site about it. Where did it fall short for your use case?
CSP allows you to whitelist/blacklist arbitrary JavaScript, and ideally you completely blacklist inline JS from being executed at all, such that only .js files from the same domain can be invoked by your website.
This serves the role of locking down injection attacks, only your explicitly approved Javascript can be invoked.
HTMX enables invoking of logic via HTML attributes on HTML elements… which CSP can’t cover
Which means you re-open yourself to injection attacks via HTML. Attackers can inject an HTML element with HTMX attributes and no amount of CSP will stop HTMX from going “Okey doke!” And invoking whatever the attributes say to do.
This effectively shoots even a completely locked down CSP config square in the nuts, totally defeating the entire point of using it.
It’s a cute idea but what is needed is a way to pre-emptively treat HTMX as a template file that transpiles everything out so the ajax happens in a separate .js file
If we had that, then it’d be safe and secure, as the whole “htmx attributes on elements” thing would just be a templating syntax, but when transpiled it wouldn’t be supported anymore so attackers can no longer inject html as an attack vector
This demonstrates a profound misunderstanding of HTMX, and how websites in general operate. So much so that I would not hesitate to describe this as somewhere between a bald-faced lie and just malicious incompetence. You can’t “invoke logic via HTML attributes,” but you can describe it. HTMX is a client-side JavaScript library that parses custom attributes you define in your HTML and uses the data described by them to initiate AJAX calls via the fetch() or XMLHttpRequest browser APIs, which CSP explicitly covers via the connect-src directive: https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Content-Security-Policy/connect-src. It’s literally just a JavaScript library that parses HTML and uses it to parameterize AJAX calls. If HTMX were somehow able to bypass CSP, then every single piece of client-side JavaScript in the world could violate it.
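For reference, a policy along these lines (the API origin is a placeholder) is all it takes for the browser to refuse any fetch/XHR request, HTMX-initiated or otherwise, to an origin that isn’t listed under `connect-src`:

```
Content-Security-Policy: default-src 'self'; script-src 'self'; connect-src 'self' https://api.example.com
```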
You can’t “invoke logic via HTML attributes,”
Oh boy a semantic argument
Proceeds to describe how you can use HTMX to invoke logic via HTML attributes
Whatever you want to call it, trigger, invoke, whatever.
You can leverage HTML attributes to automatically cause arbitrary JavaScript AJAX calls to happen, by extension of those attributes being present.
Trying to argue the semantics of this is stupid.
You put HTML attributes on shit, and the presence of those attributes in turn causes arbitrary Javascript client side logic to fire off purely due to the presence of those attributes.
That’s like, literally its entire shtick.
And any web dev who remotely understands the point of CSP and why it was created, should instantly have alarm bells going off at the concept of triggering arbitrary ajax via html attributes.
“HTMX doesn’t bypass CSP! It just (proceeds to describe the exact mechanism by which it bypasses CSP)”
It’s bonkers how many people don’t grok this, SMH.
I felt like I had a good understanding of both htmx and csp, but after this discussion I’m going to have to read up on both because both of you are making a logically sound argument to my mind.
I’m struggling to see how htmx is more vulnerable than say react or vue or angular, because with csp as far as I can tell I can explicitly lock down what htmx can do, despite any maliciously injected html that might try to do otherwise.
Thanks for this discussion 🙂
CSP works on the browser API level - all HTMX does is what you could do yourself with any AJAX: send an HTTP request to an endpoint. If the CSP disallows that endpoint, it will fail.
Oh boy a semantic argument
It turns out the language you use can be semantically ambiguous or misleading if you phrase it incorrectly. Today you learned.
And any web dev who remotely understands the point of CSP and why it was created, should instantly have alarm bells going off at the concept of triggering arbitrary ajax via html attributes.
Oh, did you finally manage to fucking Google how HTMX works so you could fish for more reasons to say it’s unsafe? What you’re describing is not a particular concern to HTMX. If an attacker can inject HTML into your page (for example, through an XSS vulnerability), they could potentially set up HTMX attributes to make requests to any endpoint, including endpoints designed to collect sensitive information. But, and this is very important, this is not a unique issue to HTMX; it’s a general security concern related to XSS vulnerabilities and improper CSP configurations.
Do you know what the correct cure for that is?
PROPER CSP CONFIGURATION.
“HTMX doesn’t bypass CSP! It just (proceeds to describe the exact mechanism by which it bypasses CSP)”
Do you genuinely not understand that CSP works on the browser API level? It doesn’t check to see if your JavaScript contains references to disallowed endpoints and then prevent it from running. I don’t know how you “think” CSP operates, but what happens is this: The browser exposes an API to allow JavaScript to make HTTP requests - specifically XMLHttpRequest and fetch(). What CSP does is tell the browser “Hey, if you get an API request via XMLHttpRequest or fetch to a disallowed endpoint, don’t fucking issue it.” That’s it. HTMX does not magically bypass the underlying CSP mechanism, because those directives operate on a level beyond HTMX’s (or any JS library’s) influence BY DESIGN. You cannot bypass it if it’s properly configured. Two very serious questions: what part of this is confusing to you? And, have you ever tested this yourself in any capacity to even see if what you’re claiming is even true? Because I have tested it and CSP will block ANY HTMX-issued request that is not allowed by CSP’s connect-src directive, assuming that’s set.
HTMX comes with a variety of CSP options, though…
Doesn’t matter, the entire implementation principle of how HTMX works and what it does inherently bypasses CSP. There’s no getting around that.
You fundamentally are invoking logic via HTML attributes, which bypasses CSP
how HTMX works and what it does inherently bypasses CSP
Well, no, not really. All HTMX really does are AJAX requests to remote resources, which are performed by interpreting attributes in HTML. You specify the type of request and the target for updating. Those requests can sometimes contain parameters, of course, but any API that accepts any kind of conditional or user generated input has to sanitize that input before doing anything meaningful with it. This requirement isn’t something particular to HTMX.
You fundamentally are invoking logic via HTML attributes, which bypasses CSP
This is not true, though. You are manipulating the DOM via HTMX, but CSP has nothing to do with dynamic content manipulation. CSP is more concerned with preventing the injection of malicious code. If what you’re referring to, however, is the possibility of someone maliciously injecting HTML with HTMX that performs some nefarious action, then I have to ask (again) why you didn’t properly sanitize user input or limit the possible connection sources in your CSP.
If you have a specific example, however, of a way in which HTMX by design violates CSP that can’t be dismissed with “you coded your website poorly,” I would love to know.
why you didn’t properly sanitize user input
This is like someone pointing out that blowing a giant hole in the hull of your ship causes it to take on water, and you respond by asking “well why aren’t you bailing out the water with a bucket?”
You do understand why Content Security Policy exists, and what it is for… right?
“We don’t need a watertight ship hull for the voyage, just reinvent and implement a bunch of strapping young lads that 24/7 bail water out of the ship as it sails, it’s faster and more efficient than doing something crazy like building your ship to be secure and water tight.”
“Wow, these screen doors really suck. I’ve stuck them on my submarine, but they just don’t keep the water out at all. Some people are going to say that I’m a fucking moron and don’t understand the technology I use or that I’m too goddamn lazy to actually take the necessary steps to keep water out of my submarine, but I know they’re wrong and it’s the technology’s fault.”
In all seriousness, HTMX is a tool designed for a specific job. If you have an API that has either non-parameterized endpoints to hit or an endpoint that accepts a single integer value or UUID or…whatever to perform a database lookup and return stored values to be interpolated into the HTML that endpoint returns, then, great, you’ve got a lightweight tool to help do that in an SPA. If you’re using it to send complex data that will be immediately and unsafely exposed to other users, then…that’s not really what it’s for. So, I think the core issue here is that you don’t really understand the use case and are opposed to it because to use it in a way that is beyond or outside the scope of its established convention is unsafe without extra work involved to guarantee said safety. It also implies you are running a website with a content security policy that either explicitly allows the execution of unsafe inline scripts or which does not care about the sources to which a script connects, which is the only way you could realistically leverage HTMX for malicious ends. So, ultimately, the choice to not adopt comprehensive security measures is one you are free to make, but I wouldn’t exactly go around telling people about it.
That’s not broad enough.
If you in any way have functionality that handles anything remotely requiring security, do not use HTMX.
This goes way beyond “parameterized endpoints”.
Listen extremely closely and pray to God any dev with more than 2 brain cells groks how serious this vulnerability is:
HTMX enables arbitrary invocation of ANY api endpoint with cookies included, through html attributes, which inherently can’t be covered by Content Security Policy
This is deeply important for any web dev worth their salt to understand.
Sanitizing User input should be your LAST layer of defence against attack vectors. Not, NOT, your first and only
It’s supposed to be your “break in case of emergency” system, not your primary (and only remaining) defense layer.
I understand the point of static websites, but Vue Router is pretty nice
If you’re using a router you can still support opening in new tab, history, etc.
Sure, but I don’t want to. SPAs are nice, but I also try to include a JS-free fallback solution that is loaded when the client doesn’t support Javascript. I think this is the best approach to web development. A good example for this is LocalMonero’s No-JS mode. You can use the toggle in the upper-left corner to disable all Javascript on the website, and it will still have most features. I love it.
For React, you can use React Router. That doesn’t mean you’ll do it well though.
It’s tough.
Skill issue
Every single page app should still be using the path to represent resources, so that history and linking work.
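A minimal sketch of that with Vue Router (the `Home`/`About` components are placeholders); history mode uses real paths, so back/forward, deep links, and open-in-new-tab all keep working:

```js
import { createRouter, createWebHistory } from 'vue-router';
import Home from './Home.vue';
import About from './About.vue';

// createWebHistory() keeps real URL paths instead of #-fragments,
// so every view is addressable, linkable, and shows up in history.
export const router = createRouter({
  history: createWebHistory(),
  routes: [
    { path: '/', component: Home },
    { path: '/about', component: About },
  ],
});
```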