076 SNS

Quoting an anonymous Twitter user (who got harassed for these statements):

"Safari is buggy" is a valid criticism.

"Safari is behind Chrome in features" is not a valid criticism.

Never forget that the browser vendors, including Google and Apple, seized control of the web from the W3C. These few companies have too much power over the web, period.

1/8

The web has massive feature bloat. It's a privacy and security nightmare.

I personally think we should abolish JavaScript and not allow arbitrary remotely loaded code to execute on our computers.

"I want web sites to do everything a native app can do" is a suicidal mistake.

2/8

The more features that are added to the web, the less browser competition is possible! This is essential to recognize.

And Google knows it! That's the whole point.

Who can keep up with Google? Mozilla can't. Apple can't. Even Microsoft threw in the towel and adopted Chromium.

3/8

Imagine a small company trying to write their own web browser from scratch nowadays. It's just not possible! The web is so complex, there's no choice but to adopt one of the few existing browser engines: Chromium, WebKit, Gecko. That's it. The competitive landscape is bleak.

4/8

"Everyone has to adopt Chromium" is exactly Google's plan.

Who controls the dominant browser engine controls the web.

5/8

In a sense, there's no point in even having "web standards" anymore.

Web standards theoretically allow *anybody* to implement a browser engine. But if the "standards" are sufficiently huge, then practically *nobody* can implement a browser engine.

6/8

I've personally implemented software from scratch using RFCs as a guide, in several different areas.

But a web browser engine? Forget it!

The "standards" now are nothing more than Chromium, WebKit, Gecko, and their individual quirks. How can there be a new engine?

7/8

The web is not "open" if nobody new can write a web browser engine. It's the illusion of openness.

8/8 Fin!

@alcinnz@floss.social this is a very good thread, thank you for cross-posting :ablobcatheadbang:

the web that browsers have to interact with is so bad :akko_badday:

@alcinnz

Blame the webdevs. If people developed directly in HTML, this wouldn't happen, and you'd be able to use your favorite "minimal" browsers. But they'd rather click a few buttons in a bloated "framework" and then also load a bunch of third-party trash like reCAPTCHA.

If people cared about how things work, and took some pride in designing something of their own, the big corpos would lose their power in a flash.

@digdeeper Yup, that is something to advocate!

However, you can't let the education-industrial-recruitment complex off the hook either. Codecamps focus on the hot new framework from Meta or Google in their poor excuse for an education, and recruiters expect these skills.

Nor the W3C for focusing near-exclusively on JavaScript APIs recently.

@alcinnz

Right. However, how much influence does a regular person have on what the big corpos are doing?

Until the systemic problems are fixed at their roots somehow, we can only do what we can. Which is, well, what I said earlier.

@digdeeper @alcinnz Like how I noticed on Thorchain's website.
See my rant about how they made it so HTML's "a" tags exclude "href" attributes unless you enable JavaScript:
https://social.076.ne.jp/notice/AIOjYsSsVyVNFqfpzM
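
(For illustration, a hypothetical TypeScript sketch of the pattern described in that rant, not Thorchain's actual code: the markup ships its <a> tags without href attributes, and a script fills them in at load time, so with JS disabled every link is dead:)

```typescript
// Hypothetical reconstruction of the anti-pattern (not Thorchain's code):
// links carry a data-href attribute instead of a real href, and only this
// script, running after page load, turns them into working links.
document.querySelectorAll<HTMLAnchorElement>("a[data-href]")
  .forEach(a => { a.href = a.dataset.href ?? "#"; });
```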

@ryo @alcinnz

Wow. Insane.

@alcinnz In my opinion, much of the HTML5 spec is actually pretty useful. Most of the unnecessary bloat comes from JavaScript. I also don't think we should abolish JS; sometimes a little scripting is needed. But it's way too overcomplicated, and many, many features should just be dropped. Let it control the DOM, receive events, and leave it at that. That's all it needs to be able to do.

@Riedler I've got a lot of thoughts here, but right now I'd say that, for filling such a role, Sun's hyping of Java at the time led to some terrible decisions which no one likes.

To make it reasonable to reimplement the DOM, the standards need to be taken back to scratch...

@Riedler @alcinnz how about XHTML2 + HTMX?

@epoch @Riedler That's where I'm going!

@alcinnz Mozilla should've put funds into other Gecko-based browsers and into creating an Electron competitor that uses Gecko. Also Servo, which would have been a great browser engine.

If it's always going to change for the worse... or get rug-pulled... (making something more solid and ours long-term instead)..

@alcinnz

Forgive me, it's morning, when I usually have my best and worst ideas... as usual this needs rethinking and looking at in different ways, but here's my quick best, which I'm sure I'll revisit myself...

Perhaps it's not about heading for choice when things are always going to change for the worse in this kind of industry, which makes us chase all the time, and makes what is good go bad and become over-controlled / monopolised / bought out.

The aim might / could / should be (?) about making it *ours*, whatever we do. Making development stick to this headline, and perhaps even creating our own ways with what is less changeable and probably already exists. I'm thinking of what is native to a server as its basic functionality.
We would then optionally install something to satisfy something else (for banking etc.), so perhaps not an all-in-one design which demands a lot and has many deal-breakers... and this is talking generally here, not necessarily about this existing development exactly.
As food for thought: designing or using something that already does what we do, but perhaps improving the other sides (I mentioned a few things before, like advertising and education for the open-source world).
https://qoto.org/@freeschool/108121601509467853
Getting people's minds and togetherness right in a human sense and also a technological one (buying Pine phones or libre laptops and helping that at the same time, etc.), tying it all together.

Something that makes what we make and design as interoperable as the basics are (like files), or as adjustably basic as possible, so in the long term we don't get constantly rug-pulled, or have big bits of our own work undone by the 'work' of others they simply hire (hiring skill-sets like mercenaries towards domination / monopoly / reducing effective opposition).

This may be too premature or not precisely written, but doing work around what, frankly, they own is always going to be on the horizon, as everything else is temporary, unless it locks in something with teeth to avoid that, or makes it very costly, or even financially rewards users to snowball it instead of destroying it... which might be short-term too. So this is what I mean when I say chasing: if they own / control things more than we do, then at some point we accept that and head for something else (using Google while closing off all the reasons you use it), and try not to give them ideas on pushing other things either. The whole system is theirs; yes, we choose where to stand... and again, all is appreciated, and I'm not claiming more than keeping to basics, which I think we all are somewhat... but not expecting same-same or even similar products or services always, or to feed their bad designs.

Eventually people are politically kicked out of things, rug-pulled, oxygen-starved, while people claim they didn't do it... so it means we have to accept less in the long term, and perhaps, with my designer hat on, much less.

Again, I mean something that stays right more often or already works well, something like what offline does, or audio between us, etc.

And then generally, afterwards, increment some blocks of things that are equally solid or basic, maybe just a reinterpretation of the basics rather than a big dependency where one thing breaks the complex features on top.
Adding a plugin or module for one specific task, perhaps, to try to achieve something in our world or our design, or maybe for 'their' world as an equivalent or token attempt, while still focusing on just add-ons from existing basics, *keeping* the basics... which in the end we might just call pictures and text, for the most part!

Again, a quick post, which like everything needs a 2nd... 3rd... 4th pass, with people's input too...
Thanks for your work.

@alcinnz@floss.social if a web site requires native app functionality, then there should be a native app for that functionality, so the end user can install it if they wish to.

@alcinnz@floss.social take Java, for instance: we used to be able to run Java in web browsers, but nowadays you basically don't see it anywhere.
But I think it's still technically possible.
And the end user has to have Java installed on their machine.
I think this would be an acceptable replacement for JavaScript.

@logan @alcinnz sounds like browser plugins in the 00s

@alcinnz I like the ease of creating beautiful things with CSS, and the hackiness of HTML, but my love for web technology stops there

@alcinnz I was thinking for a while of just making a web engine from scratch with no JS functionality at all.
JS is the biggest contributor to de facto planned obsolescence in both PC and smartphone hardware, and there is a slowly growing niche that prefers a no-JS experience.
This is also the only real way to defeat the ever-expanding JS-only web, even if it'll be used by only 2% of the entire planet.

If it's always going to change for the worse... or get rug-pulled... (making something more solid and ours long-term instead)..

@freeschool I'm struggling to comprehend what you're saying, but while I'm confident in the route I'm taking, I'm happy for others to explore other fixes like Gemini!

I do think we need to give up on webdesign being fully interoperable...

@epoch@tilde.zone @alcinnz@floss.social perhaps?

idk exactly what it was called *shrugs*
and sure, the end user had to install something, but that's the point, they had to install something
😺 they had to think about whether or not they want this code to run on their computers, and then put things in place to make it work.

@logan @alcinnz no, please, not Java, that's even worse: running Java code on your PC

@straw@socks.pinnoto.org @alcinnz@floss.social it's less bad than JavaScript. If you do not want the functionality, then you can just not install Java.

@logan @straw @alcinnz That's kind of what WASM should be, but isn't.

If it's always going to change for the worse... or get rug-pulled... (making something more solid and ours long-term instead)..

@alcinnz

Indeed - I was a bit fast there, and in fact I might have to tell myself to slow down a bit :)

As a 2nd attempt / summary: I guess following the native architecture on servers and what already works is what I like and even want more of (FTP is one example, perhaps better than anything else if you can stick to its basics in certain contexts, or balanced - I find Nextcloud an unnecessary layer for my basic usage). And then all these basics put towards education
...but maybe I don't offer a concrete solution beyond the computer basics and then human improvement / evolution.
Reverting to those almost-offline type things and existing server structures I liked the idea of.

Similar to what we already use offline, which works and is 'interoperable' with other people's files and formats. But I'll give this one a rest. Thanks.

But sometimes I want to start a new engine. Among the existing ones I like NetSurf: it's written in C, has the most compact and robust implementation, and its own renderer.

@iron_bug Same here!

@alcinnz there is a new web browser engine in development... it might replace Gecko one day

https://servo.org/

@alcinnz
I agree.

I was really surprised that the SerenityOS community finally passed the Acid3 test. Never thought anyone would try. No real alternative or competition - their goal is just to DIY everything.

@xplore It kind of did! Mozilla Quantum was all about swapping Servo components into Gecko without regressing standards support.

I don't believe Servo's still active...

I want a browser written in C on an SDL or EFL graphics and sound base, without JS. (I think HTTP must be extended with active client requests for this; initially JS was used for that, and then it turned into a cancer.) And something safer and more lightweight can be used for scripting on servers, maybe Lua or something, but I stand against user-side scripting altogether. I cannot imagine a way it could be implemented safely.

A further problem is the graphics and direct memory access they made specially for browsers. I believe it was a deadly mistake: the most untrustworthy and buggy software has direct access to video buffers in the system. This is wrong, just wrong. And JS effectively turns any browser into malware. So the web as it is now is something evil and dangerous.

@iron_bug @alcinnz i wanna start a new engine too.

@alcinnz Safari is behind Chrome in some extremely useful features that would allow web apps to rival native apps, and it's strongly suspected that Apple doesn't support them because it would damage their role as rent-seeking gatekeeper to mobile devices.

No push notification support, only partial support for web manifests, no background sync, etc.
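
(Claims like these are easy to spot-check from a browser console; a small TypeScript sketch using standard Web API names follows. Which probes pass will vary by browser and version:)

```typescript
// Probe for the capabilities mentioned above; each name is a standard
// Web API interface, detected by its presence on the global objects.
const probes = {
  serviceWorker: "serviceWorker" in navigator, // needed for offline web apps
  pushNotifications: "PushManager" in window,  // Web Push
  backgroundSync: "SyncManager" in window,     // Background Sync
};
console.table(probes);
```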

~> @alcinnz@floss.social I do not think JS should be abolished.

I think that it simply should not be allowed on normal webpages. There should be a different, separate type of webpage that allows JS.

@forever @alcinnz That's basically how I browsed the web for a long time using NoScript. I'd turn on JS for web apps that I knew I could trust. Every other web site was prohibited from running JS.

@forever @alcinnz maybe a separation between web pages and web apps

~> @LunaDragofelis@embracing.space @alcinnz@floss.social this

yes. web pages ≠ web apps and it should be that way

web apps would have to be in a different environment

@forever @alcinnz @LunaDragofelis and then every website turns into a web app

@patricia
This thread 👆
Are you still trying to do this?

@alcinnz not only are the standards huge and incomprehensible, they are changing unpredictably and often

@frank87 It's not the route I'm taking; I think the web could be salvaged. But I am glad there are those exploring an alternative!

@LunaDragofelis @forever @alcinnz been tryna figure out a way to programmatically figure out if a webpage is *not* a web app, *and* is safe to customize and use media queries on without fingerprinting. And I think I figured it out.

1. The page's CSP must have the equivalent of "connect-src 'none'" and a "sandbox" directive that has nothing more than "allow-same-origin" and "allow-downloads".
2. The CSS has no background images or @import directives
3. The page uses a secure cipher suite (no injection)
4. The HTML's image srcsets do not convey any information about the display or viewport size.
5. The HTML's "media" attributes do not conditionally load resources with more than two conditional branches. Loading a light or dark image is fine, for instance; too many options can reveal information that is too exact.

It's probably possible to make no. 1 more relaxed for forms; specifying a single specific absolute URL for a form endpoint should be okay.

For no. 2, I decided a blanket ban on CSS loading external resources is for the best. Inline CSS combined with external loading can tell a server if the user has CSS enabled or uses CSS overrides.

For number 4: even coarse, imprecise information about a display is risky. Site A knows your display is under 1400 pixels; Site B only knows it's above 1350 pixels. If sites A and B share information, then we suddenly have pretty exact information.

If all of the above check out, a web-lite user agent can load the page and treat it as a web document, forbidding it from making any requests after loading is complete. Otherwise, it can prompt the user to open in the default browser.
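
(A minimal sketch in TypeScript of how a user agent might test rule no. 1, assuming the CSP arrives as a plain header string. This is an illustration of the heuristic above, not Seirdy's code; rules 2 through 5 would additionally need HTML and CSS parsing:)

```typescript
// Decide whether a CSP header is restrictive enough for "web document"
// treatment per rule 1: connect-src 'none', plus a sandbox directive
// carrying nothing beyond allow-same-origin / allow-downloads.
function looksLikeWebDocument(cspHeader: string): boolean {
  // CSP syntax: "name value value; name value; ..."
  const directives = new Map<string, string[]>();
  for (const part of cspHeader.split(";")) {
    const [name, ...values] = part.trim().split(/\s+/);
    if (name) directives.set(name.toLowerCase(), values);
  }
  const connect = directives.get("connect-src");
  const sandbox = directives.get("sandbox");
  const allowed = new Set(["allow-same-origin", "allow-downloads"]);
  return connect !== undefined
    && connect.length === 1 && connect[0] === "'none'"
    && sandbox !== undefined
    && sandbox.every(v => allowed.has(v.toLowerCase()));
}

// e.g. looksLikeWebDocument("connect-src 'none'; sandbox allow-downloads")
//   === true
```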

~> @Seirdy@pleroma.envs.net @LunaDragofelis@embracing.space @alcinnz@floss.social hmmm

i kinda disagree with the second point

on my personal website i @import a style.css and have set up CORS on my other sites to allow that same file to be loaded

it's not much of anything

and on random.amongtech.cc the images are imported from codeberg

many things use external resources so um. yea

@forever @alcinnz @LunaDragofelis the problem is that not everyone is as cool as you, and some can misuse them.

It's better to just concatenate your files or only reference them from HTML, to reduce the scope of conditional loading.

I'm trying to do this so that someday the Tor Browser can figure out that it's safe to turn on dark mode, accessibility tweaks, and other customizations. I need to err on the side of safety.

@Seirdy @LunaDragofelis @forever There's a way I implement to minimize the harm of loading CSS-imported media: load it before CSS styling rather than after!

Unfortunately to my knowledge I'm the only one doing this...

And there's a tradeoff in that this approach might load higher-resolution images than you need...

So yeah, not appropriate for your Tor Browser use case...
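
(A rough TypeScript sketch of that eager-loading idea, as an illustration rather than the actual implementation: fetch every url() a stylesheet references before resolving styles, so the server can't tell which conditional rules matched:)

```typescript
// Fetch every url() referenced anywhere in a stylesheet *before* style
// resolution, regardless of media queries or matching selectors, so
// resource requests leak nothing about the user's display or settings.
async function prefetchStylesheetResources(
  cssText: string,
): Promise<Map<string, Blob>> {
  const urls = [...cssText.matchAll(/url\(\s*["']?([^"')\s]+)["']?\s*\)/g)]
    .map(m => m[1]);
  const cache = new Map<string, Blob>();
  await Promise.all(urls.map(async u => {
    cache.set(u, await (await fetch(u)).blob()); // unconditional fetch
  }));
  return cache; // later style resolution reads only from this cache
}
```

The tradeoff mentioned above falls out directly: everything is fetched, including higher-resolution variants the layout will never display.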

@hobson Yes, it makes me flinch whenever I hear the words "living standard"!

@alcinnz
Code is law. Here too.

@bob I'm personally taking the former route; there's a lot to love about HTML/CSS *mostly* as-is! For whatever reason there's already *plenty* of sites which adhere to a reasonable subset!

I wouldn't mind if webapps went full WebAssembly, eventually asking webdevs to reimplement text layout, event dispatch, etc. *inside* the sandbox. There's a possibility the web could evolve into something reasonable that way. But I'm not personally invested in the survival of webapps.

@logan @alcinnz and for the "you don't have to install it and you can access it on any computer" argument, it should be possible to just provide a self-contained executable

@iron_bug @alcinnz Modern GPUs (i.e. anything supporting WebGL) have enough isolation that malicious code shouldn't be able to do anything worse than a denial-of-service by causing GPU timeouts.

(But even that can kill the display server with some drivers.)

It's not like any GPU would have a vulnerability allowing modification (and possibly reading) of another process' graphics memory, right? Right?

(Hint: some do)

shit like "WebGL", "Webasm" (although it has nothing to do with assembler), remote running of untrusted code, DRM (I mean not the kernel feature but the proprietary crap), etc should not exist. this is a hole in security.
browser is the most unstrusted application on PC. it should be limiteb by the hardest virtual environments and have no direct access to any hardware at all.

@alcinnz I agree with most of the thread, but I think it’s a really good thing that a website can do most of what a desktop app can! There’s a far lower barrier of entry to making a website than to making a native app, and I don’t want to have to download a native app to buy something from an online store or join a social media platform.

Even mastodon is built primarily on the web, and it relies on the ability to execute (some) code in the browser to work!

@lectie A couple thoughts here: I do concede that the web's arguably the most ethical way to distribute apps to "normal" people.

At the same time I'm not impressed by your examples: Have you heard of Brutaldon? And there's plenty of precedent showing that all an online store needs to accept payment are webforms.

Gemini seems to differ, but I have nothing against webforms! Even if I would've designed those webstandards slightly differently.

@alcinnz I just looked up Brutaldon, and it looks interesting. That being said, while I know server-side rendering *exists*, I'm not sure it's actually preferable to webpages made interactive via JavaScript. Even Brutaldon uses JavaScript to "unobtrusively enhance the user experience", when available.

Web apps are a cool, powerful thing. It's cool that I can edit markdown or code, right in the browser, and see results.

@alcinnz and I also remember really neat interactive demos that are only possible because we can run stuff in the browser! See for example: https://gravitysimulator.org/misc/earth-spoils-the-rings-of-saturn

A lower barrier of entry? But who cares?
Imagine you come to a restaurant and they say: we have some noobs working here, please enjoy this inedible burnt crap, we had too little time to cook it. Or at a dentist's clinic they claim something like: our dentist is too stupid and unskilled, and thus we will pull out your teeth because we cannot fix them. There's something wrong with this, don't you find?
Then why should users consume software that is made by some stupid monkeys? This is not a service, this is crap.

@lectie All I can say is that in implementing my own browser engines I do plan to support the JS uses Brutaldon has without requiring JS, and that I do plan to have WYSIWYG editing as a browser feature rather than a webplatform feature. Though I might turn that into a webform input type too if I can figure out a nice HTML syntax that gracefully degrades.

This seems easier than implementing JS... Especially now that CRDTs have been invented...

@lectie
I instead sometimes ask myself why using online stores is not just a protocol. I could use my preferred online shopping application, with the benefit of a unified search and UI, instead of trying to understand a new webshop for everything.
@alcinnz

@iron_bug @alcinnz most businesses with a website aren't in the business of having a website. I don't care about the quality of their website, I care about the quality of their service (which ISN'T the website).

Sure, a restaurant website or a website for a dentist might be crappy, but it’s much better than not having a website at all. It’s *good* that I can see phone numbers, menus, hours, and information.

Nope. A site is also part of the business and the quality of service. Programmer is a profession, and user data leaks from various crappy websites are the proof. It should not be like that. Non-professional noobs should not touch development in production. One may train to cook or program (or anything else) at home, but when it comes to work there should be professional standards and requirements for hiring personnel. Nobody wants shitty service, and software is a service too.

@iron_bug @lectie Also I'd add: it's not the random dentist's site which tends to be the problem in requiring JavaScript.

What amounts to an online brochure tends to be quite tame that way. Unless of course it was made using, say, Wix.

Which would be a great reason for me to build WYSIWYG editing into a browser!

@iron_bug @alcinnz programming is a profession, yes. But many small businesses and organizations *do not have it in their budget to hire a programmer*. Nor should they need to. We cannot impose our professional standards on other fields. It's like saying "everyone should be required to hire a graphic designer when making a poster". No, that's absurd!

@lectie @iron_bug Yes, but it's perfectly reasonable for us to minimize the harm they can do. Which we aren't really doing.

@alcinnz @iron_bug you’re absolutely correct! For plenty of businesses, a static site would work fine. And I agree a WYSIWYG editor would be great to have in a browser!

@alcinnz @iron_bug I agree, and we should be working to minimize the harm that can be done! The path of least resistance should be the one that’s most likely to provide a correct outcome for the average person looking to make a website.

They shouldn't. There are IT companies for this. IT should deal with IT. Doctors should cure, tailors should sew, bakers should bake. And there's no need for "small businesses" to bake their own bread for breakfast or keep a personal tailor to sew their pants. This is senseless.

@iron_bug @alcinnz @ixn They just want to turn web browsers (more specifically Chromium) into an OS.
The end goal is to force everyone onto dumb terminals with only a monitor, keyboard, mouse, and a WiFi card (so no HDD/SSD/eMMC, RAM, graphics card etc).
Aka, PC as a Service, and WinDOS 10 and 11, and macOS Big Slur and newer are preparing you for that bit by bit.

@iron_bug @alcinnz I personally prefer a web where anyone can make a website to one where you have to hire a firm to make it for you.

Most websites need not be perfect. People cook for themselves, people clean for themselves, and people can make websites for themselves.

Besides: most people trying to make a website will either:
- end up with a static site, with no security risks, OR
- use a firm like SquareSpace (in which case the onus is on SquareSpace)

Yes, it's a good idea. But greedy salesmen won't agree with this great idea: they need to shove their ad bullshit at you and steal your data to sell it.

@iron_bug @alcinnz @lectie Small businesses exist to do 1 thing well, rather than dozens of things poorly.
So if you're a small business doing customer support, outsource your programming to a programming business for example.

Yes, there are plenty of ready-made solutions for standard use cases, including free and open source ones. No problem. And even for more complicated requirements, 90% of cases are typical, and software for them is already written, tested, and works fine. It's pretty cheap if one orders it from professional developers.

Most websites are crap. And here we have a bullshit web that is bloated, slow, buggy, data-leaking, and insecure. Just because people think programming is a kind of game they can learn "in 10 days". Nope. Programming is a skill that is acquired over 10 years.

@iron_bug @alcinnz @lectie You can learn programming in a few months, but that doesn't mean you'll be good at it.
After you've learned programming, there are still many years of improvement you need to make in order to be truly skilled.
Just saying, as a programmer who has been programming in C and PHP for well over a decade; I too still think I'm far from the best.

@iron_bug @alcinnz and in the second case, that’s actually the one where you’re *more* likely to end up with tracking software on the website. Same with firms: if you hire a firm to manage your website, there’s nothing preventing them from putting user metrics and other trackers on the site.

Expecting people to hire or use firms doesn’t solve the problem.

It's all about agreements. If you write such a requirement, then there's no problem. People pay money for software development and support, and there's no need to do any extra features for the same money, absolutely.

The trackers are the result of greed. People want "free cheese", but there's no such thing as free cheese. Free cheese is only in a mousetrap. If one wants a secure web, he must pay for a server, get normal professional software, and not look for a 1-buck-per-month "service" at some centralized platform that lives on stealing data from its users.

@iron_bug @alcinnz I don’t know anyone who genuinely believes it’s possible to learn programming in 10 days. The only place I’ve ever seen stuff like that is from companies trying to sell cheap courses.

Besides, the web being bloated, slow, buggy, or insecure is much more the fault of large companies that *do* have programmers on staff, than it is the fault of individuals or small businesses.

It's business. If some company creates buggy, bloated sites, it will just go bust.
And I can assure you the worst things on the web are things that are made by amateurs. The Fediverse is an example of really shitty web software. When I look at the traffic in the Fediverse, it makes me really sad. Yes, I'm going to make an optimized implementation that won't DDoS the other servers with useless repeated requests. But this won't fix the problem: until all the instances are written by professionals, it will be like that.

I can process 20,000 requests per second on a very humble server, and I need just a couple of megabytes of memory for this. It's trivial. The web implementations process barely a hundred requests per second and devour 4-10 GB of RAM and a huge 8-core CPU on servers. This is the difference between professional programming and monkey scripting.

And I assure you, if all the web were written by professionals, we would have a fast, clear, safe, and stable web that would work thousands of times faster and take thousands of times fewer resources.
I say this as a programmer with over 20 years of experience in system and network programming. I know the capabilities of the hardware, and I see the tremendous waste of resources because of stupid monkey coders. This is just a result of incompetence and careless development.

@alcinnz Even in such a world, there's still a point to them: having a well-written description of the semantics you can _expect_ from a browser (that does not necessarily mean you'll get them, but it's easy to argue that something is a bug when it contravenes the specification it is supposed to implement).

A good standard is a dead standard! :)

@alcinnz It's still under development, as there was a commit yesterday, though no longer at Mozilla: it's now supported by the Linux Foundation.

But yes, it's been under development for 10 years, so I think your question is still valid, as it's clearly overcomplicated to build a web browser engine!

@alcinnz

From an Inkscapian/SVG perspective, the W3C (i.e. Google and Mozilla) have been shitting all over our standards work for years.

Then when we get SVG 2.0, they do *nothing* to implement it. Once again leaving Inkscape the only software to support a web standard that it couldn't get its own needs into.

When people say to rewrite Inkscape as an HTML5 tool, they have this idea that SVG is well supported in Chrome. It is NOT.

@doctormo I'm not surprised! After all, it took them forever to implement new form inputs! And CSS seems to be at a lower priority for them than JavaScript.

Btw, very soon I'll start/continue discussing the SVG renderer I'm planning to pull in for Haphaestus. Pure Haskell; it uses a different XML parser than I do, but renders to the same image type I plan to use otherwise. Thought you might be interested!

@alcinnz Of course!

SVG is a really hard spec to support, but I love to see new ones. Most might be tempted to just use librsvg... which would negate JavaScript and SMIL, but would be better than Chrome for mesh gradients at least ;-)

@doctormo Well, I'm glad Rasterific SVG exists already!

Because otherwise I might not have ever gotten around to implementing it...

@alcinnz I've read the whole documentation for Gemini in about half an hour.

techpost, SVG

@doctormo @alcinnz librsvg is quite capable but it’s typically packaged with a metric ton of dynamically linked libraries. Would be cool to see a “librsvg-lite” package.

SVG is a really hard spec to support, but I love to see new ones.

SerenityOS’s LibWeb’s SVG component, NetSurf’s Libsvgtiny, and resvg are some that I have my eyes on.

@doctormo

Sodipodi/Inkscape is such a success story over why XML rules and JSON drools, namely namespaces and schema. The generational paradigm shift to JSON has been such a NIH-driven nightmarish reinvention of everything but worse. (JSON made some amount of sense when you were deserializing your own known-safe objects, but the fact that e.g. ActivityPub is in JSON is messed up.)

Yeah, SVG as a web standard is not doing well.

SVG as Inkscape's format works well but it's a square-peg fit—everything (that has, for example, path effects) needs to be represented twice, both with all the Inkscape concepts and handles, and as viewable-everywhere plain paths. Kinda strange from the perspective of a platonic ideal vector serialization format. But, again, thanks to the life-changing magic of namespaced tagged tree data that is XML, at least it works. ♥

@Sandra @doctormo XML got such a bad reputation because in the late 90's people were screaming that XML was the solution to everything (similar to how some people argue that blockchains are the solution to everything today).

The solutions some people built were so horrendous that the backlash led to JSON, which ignores all the good things about XML, and even the things where it improves on XML are done poorly (unspecified numeric precision in an interchange format. What the actual F?)
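
(A concrete example of that numeric-precision complaint, in TypeScript: JavaScript parses JSON numbers into IEEE-754 doubles, which store integers exactly only up to 2^53:)

```typescript
// 2^53 = 9007199254740992 is the largest integer a double holds exactly;
// parsing one more than that silently rounds.
const n: number = JSON.parse("9007199254740993");
console.log(n === 9007199254740992); // true: precision was lost
console.log(JSON.stringify(n));      // "9007199254740992"
```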

@loke @Sandra @doctormo
Yeah, XML vs. JSON is comparing a Bazaar and a Cathedral.

And I would argue that people who can't tell the difference should be considered incompetent and should be blatantly ignored when it comes to choosing formats.