We see this in our open-source community. We've had a community channel for over two decades, where community members help newcomers and each other solve problems and answer questions.
Increasingly we have people join who tell us they've been struggling with a problem "for days". Per routine, we ask for their configuration, and it turns out they've been asking ChatGPT, Claude or some other LLM for assistance and their configuration is a total mess.
Something about this feels really broken, when a channel full of domain experts is willing to lend a hand (within reason) for free. But instead, people increasingly turn to machines that are well known to hallucinate. They just don't think the machine will hallucinate for them.
In fact I see this pattern a lot. People use LLMs for stuff within their domain of expertise, or just ask them questions about washing cars, and they laugh at how incompetent and illogical they are. Then, hours later, they will happily query ChatGPT for mortgage advice, or whatever. If they don't have the knowledge to verify it themselves then they seem more willing to believe it is accurate, where in fact they should be even more careful.
> In fact I see this pattern a lot. People use LLMs for stuff within their domain of expertise, or just ask them questions about washing cars, and they laugh at how incompetent and illogical they are. Then, hours later, they will happily query ChatGPT for mortgage advice, or whatever. If they don't have the knowledge to verify it themselves then they seem more willing to believe it is accurate, where in fact they should be even more careful.
The AI companies have taken all the wrong lessons from social media and learned how to make their products addictive and sticky.
I’m a certified hater, but even I’ve fallen into the exact trap you’re describing. Late last year I was in the process of buying a house that had a few known issues with a 30 day close. I had a couple sleepless nights because I had asked ChatGPT or Claude about some peculiar situation and the bots would tell me that I was completely screwed and give me advice to get out of the contract or draft a letter to the seller begging for some concession or more time. Then the next day I’d get a call from the mortgage guy or the attorney or the insurance broker and turns out, the people who actually knew what they were doing fixed my problem in 5 minutes.
This _is_ all true, but what's also true is that there's a historical pattern (in many communities) of "n00bs" not being or (at least) not _feeling_ welcome. So, I can't say I blame people for spinning in circles with LLMs instead of starting with forums or mailing lists where they may be shamed or have their questions closed immediately as "duplicate" or "off-topic" (e.g. SO).
I think if we want newcomers to lead with human interactions, the onus is on us community leaders/elders/whatever to be a little warmer, more understanding, and forgiving. (Of course, some communities and venues are already very good about all of this and I'm generalizing to make the larger point.)
Personally, this type of behavior played a large part in why I left two OSS communities.
A lot of the passersby nowadays feel like trolls. They come in copy-pasting ChatGPT responses and spamming that they need help, instead of chit-chatting and asking questions. We fix their problems, and they don't trust or understand us at all. Or worse, we tell them their situation is unreasonably bad and they should start over, and they scream at us about how some unimaginably bad code passes tests and compiles just fine, and how we are dumb.
They tell us, in one way or another, that we don't need to exist anymore. They show off terrible code, we try to offer real suggestions to improve it, and they don't care. Then they leave the community once their vibe/agentic coding moves past that part of their code base. A complete waste of time: they learned nothing, contributed nothing, no fun was had, no ah-has, just grimy interactions.
I’m subscribed to a couple of mailing lists and follow the archives of a few others. I wonder if the friction associated with the medium is why I haven’t seen those shenanigans?
URL parsing/normalisation/escaping/unescaping is a minefield. There are many edge cases where every implementation does things differently. This is a perfect example.
It gets worse if you are mapping URLs to a filesystem (e.g. for serving files). Even though they look similar, URL paths have different capabilities and rules than filesystem paths, and different filesystems vary among themselves too. This is also an example of that (I don't think most filesystems support empty directory names).
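As a small illustration of the kind of mismatch described above (not the specific bug in question, just a generic sketch): a URL path may legally contain empty segments and percent-encoded separators, and the order in which you split and decode changes what hits the filesystem.

```python
from urllib.parse import urlsplit, unquote

# A URL path with an empty segment ('//') and a percent-encoded slash.
# URL syntax allows both; most filesystems reject or collapse them.
path = urlsplit("https://example.com/static//..%2fsecret/file.txt").path

# Splitting first keeps the encoded '/' inert inside one segment:
segments = path.split("/")
print(segments)        # ['', 'static', '', '..%2fsecret', 'file.txt']

# Decoding BEFORE splitting turns '..%2f' into a real traversal step:
decoded_first = unquote(path).split("/")
print(decoded_first)   # ['', 'static', '', '..', 'secret', 'file.txt']
```

Two implementations that disagree only on this ordering (or on how to treat the empty segment) will map the same URL to different files, which is exactly how these edge cases become security bugs.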
Yes, it is. The worst offenders hammer us (and others) with thousands upon thousands of requests, and each request uses unique IP addresses making all per-IP limits useless.
We implemented an anti-bot challenge and it helped for a while. Then our server collapsed again recently. The perf command showed that the actual TLS handshakes inside nginx were using over 50% of our server's CPU, starving other stuff on the machine.
Yeah, iOS is definitely a weaker spot and we're aware of it. Monal is currently working on an overhaul of their UI (they have a grant allocated for it: https://nlnet.nl/project/Monal-IM-UI/ ).
There is also a new app in the works between Cheogram and Snikket. There is a beta available, but it's still young (and we won't apply any Snikket branding until E2EE is complete).
Be aware that this post has known issues that the author is not interested in fixing. In their own words (in response to clarifications by one of the OMEMO folk):
"I'll make an edit later about the protocol version thing, but I'm not interested in having questions answered. My entire horse in this race is for evangelists to f** off and leave me alone. That's it. That's all I want." [censorship of profanity mine]
You won't find this quote in the article with Ctrl+F, it's in the screenshot, where they omitted the original constructive comment by one of the OMEMO contributors that they chose to moderate, which you can find here: https://www.moparisthebest.com/tim-henkes-omemo-response.txt
So, by all means, read the blog post. But just be aware that its ultimate goal was not to be an unbiased accurate technical article.
The post is an opinion piece and not a technical article for sure, but I'm not sure the takeaway from that quote is that the article is inaccurate; rather, the author isn't really looking to start a conversation, just to state their opinions. It seems they've made multiple edits where they believe there are inaccuracies.
FWIW, I personally think Henke is correct to state that creating "...a product based on XMPP+OMEMO that, exactly like Signal, can only communicate with other Signal users and always has encryption on." would largely address most of the critiques (or at least the ones that bother me most), but that Soatok is also correct in concluding that the XMPP ecosystem and the way OMEMO is used in clients today does not meet their definition of "Signal competitor"[0], which I think is still a useful way to frame things.
I'm the founder of both the Prosody and Snikket projects. Sorry about triggering alarms :) I can try to explain...
Prosody is a popular choice of XMPP server software. It's used for all kinds of stuff, from self-hosted chat servers to powering Jitsi Meet, to Internet-of-Things applications.
Prosody is extremely flexible, and has a bunch of configuration options that allow you to adapt it and extend it however you want. For some people, this is ideal. Those people should continue using Prosody.
Snikket has a different scope. It is specifically an answer to a question like "How can I easily make a self-hosted WhatsApp/Signal for my family/friends using open-source software?"
- Snikket contains Prosody, for the core chat part. But it's Prosody with a very specific configuration, and the configuration is part of the project, it's not intended to be modified by the person deploying Snikket. They only need to provide the domain name.
- Snikket also includes additional components that a modern chat service needs. For example, it includes a STUN/TURN server to ensure that audio/video calls work reliably (again, preconfigured).
- Snikket provides its own apps, which are tested and developed in sync with each other and with the server. This avoids the common problem of incompatibilities that occur when you have an open ecosystem such as XMPP, where different open-source project developers may develop features at different paces, leaving users to figure out which ones support which feature. It also solves the discoverability and decision fatigue for users (searching "Snikket" on an app store will get you an app that you know is compatible with your Snikket server, you don't have to go through a list of XMPP clients and figure out which one is suitable).
- Snikket servers are not designed to be open public servers (these are an administrative nightmare). Instead, your server is closed and private by default. As the admin, you choose who signs up to your server by sending invitation links. The invitations also serve to simplify the account setup process - no need to prompt users to "choose a server", etc. They just need to provide a username.
Projects such as Conversations differ by running a single public server (conversations.im) and guiding people to sign up on that server, or to choose one from a long list of free public XMPP providers. In some cases that's all you want. But onboarding a group of people that way is not fun (for example, they all have to share their addresses with the group and add each other to their contact lists one by one - Snikket makes discovery of contacts within the same server automatic).
Beyond these things, Snikket is all open-source and XMPP. But there is a focus on making a good polished and secure "product", if you like, rather than supporting the entire diverse XMPP ecosystem which includes a range of software of varying quality (weekend projects and more recently, 100% vibe-coded clients). For example, Snikket servers require certain security and authentication features which some older codebases that have fallen far behind modern XMPP standards (think Pidgin, etc.) simply don't support today.
> it’s actually all based on prosody and conversations?
As mentioned, I develop Prosody. I also collaborate with the Conversations developer and other XMPP projects. There's nothing shady here. The goal is just to make a best-in-class XMPP project that solves one particular use case (and it was primarily my own use case to begin with of course - I wanted to move my family off WhatsApp).
Ohh, wow. First off, thanks for prosody, been using it for several years, ever since I switched from my early 2000s jabber.org account to selfhosted.
And yeah, I get what you are saying, I'm using it the same way you envision snikket, just for my wife and I. Considering how much time I spent on the initial setup, I can very much see wanting a preconfigured version.
I guess the site was just too "non-technical" and went over my head when I tried to grok it (both a while ago and again now, before writing this comment); the lack of a download option for the client on the Snikket site, combined with the repeated talk about invites, just rubbed me the wrong way.
As I have already set up my server, and have Gajim/Conversations (which AFAIK are the best modern Windows/Android clients - for Windows, probably even the only one supporting modern XMPP) for desktop/mobile, I have no need for Snikket, but my view went from negative to very positive ;)
I'm still experimenting with the messaging on the Snikket website. However my general approach with the site was to pitch Snikket to people who don't know what XMPP is, which is, frankly, the majority of people. Instead, I wanted to focus on explaining features it enables rather than protocol details. But I'm aware it has caused a lot of head-scratching among people who already know Snikket uses XMPP :)
I see Snikket as kind of a gateway into the XMPP ecosystem for people who are unfamiliar with it. After all, if you're already familiar with XMPP then the chances are you'll probably be happier with Prosody or ejabberd, and you'll already have opinions about which clients you want to use (e.g. the upstreams of Snikket).
Yes, definitely. To me, the idea of a chat server that doesn't federate is as absurd as setting up an email server that doesn't federate. I understand that today people know more contacts with email addresses than XMPP addresses, but if we ever want to free ourselves from the current walled gardens, we need to stop treating chat as something that only happens in walled gardens.
Some people get worried about the idea of "federation", thinking that it somehow means their server is less private, and their data is being spread across a mesh of servers, and stuff like that. That's true in some decentralized/distributed chat protocols, but not in XMPP. Connections between servers only happen on demand, similar to how, when you send email between different email providers, the providers connect to each other to deliver the messages.
However we do have a feature which allows disabling federation access for specific accounts, for example to prevent kids from communicating with anyone outside their own Snikket server. This is a feature I want to expand on, so that you can permit communication with a limited number of approved contacts on other servers.
I am the current lead developer of Pidgin, and would like to reinforce the level of collaboration in the XMPP world. Even with Pidgin being very far behind in XMPP (and everything else), everyone has been very helpful as we try to catch back up, answering questions about the Prosody instance we run ourselves.
This is a great explanation; Prosody/ejabberd seem to kind of be "everything to everybody" but because they are so general it's hard to know if they're a good fit for any one particular purpose.
Snikket seems to just be a focus or lens on Prosody that answers that question for the mission statement you gave.
As the founder of both projects, explaining the difference between the two projects is roughly 20% of my working day (okay, not quite 20% but sometimes it feels that way).
They weren't "HTTPS certificates" originally, just certificates. They may be "HTTPS certificates" today if you listen to some people. However there was never a line drawn where one day they weren't "HTTPS certificates" and the next day they were. The ecosystem was just gradually pushed in that direction because of the dominance of the browser vendors and the popularity of the web.
I put "HTTPS certificates" in quotes in this comment because it is not a technical term defined anywhere, just a concept that "these certificates should only be used for HTTPS". The core specifications talk about "TLS servers" and "TLS clients".
This is technically true, and nobody contested the CABF's focus on TLS for HTTPS.
However, eventually, the CABF started imposing restrictions on the public CA operators regarding the issuance of non-HTTPS certificates. Nominally, the CAs are still offering "TLS certificates", but due to the pressure from the CABF, the allowed certificates are getting more and more limited, with the removal of SRVname a few years ago, and the removal of clientAuth that this thread is about.
I can understand the CABF position of "just make your own PKI" to a degree, but in practice that would require a LetsEncrypt level of effort for something that is already perfectly provided by LetsEncrypt, if it weren't for the CABF lobbying.
> CABF started imposing restrictions on the public CA operators regarding the issuance of non-HTTPS certificates.
The restriction is on signing non-web certificates with the same root/intermediate that is part of the WebPKI.
There's no rule (that I'm aware of?) that says the CAs can't have different signing roots for whatever use-case that are then trusted by people who need that use case.
> The CA/Browser Forum is a voluntary organization of Certification Authorities and suppliers of Internet browser and other relying‐party software applications.
IMHO "other relying-party software applications" can include XMPP servers (also perhaps SMTP, IMAP, FTPS, NNTP, etc).
It is a single member of the CAB that is insisting on changing the MAY to a MUST NOT for clientAuth. Why does that single member, Google-Chrome, get to dictate this?
Has Mozilla insisted on changing the meaning of §1.3 to basically remove "other relying‐party software applications"? Apple-Safari? Or any other of the "Certificate Consumers":
The membership of the CAB collectively agrees to the requirements/restrictions they place on themselves, and those requirements (a) cover both browser and non-browser use cases, and (b) explicitly allow clientAuth usage as a MAY; see §7.1.2.10.6, §7.1.2.7.10.
Of these, (1) and (2) are already implemented in XMPP.
(1) just isn't that widely deployed due to low DNSSEC adoption and setup complexity, but there is a push to get server operators to use it if they can.
(2) is defined in RFC 7711: https://www.rfc-editor.org/rfc/rfc7711 however it has more latency and complexity compared to just using a valid certificate directly in the XMPP connection's TLS handshake. Its main use is for XMPP hosting providers that don't have access to a domain's HTTPS.
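For the curious, the two mechanisms look for their material in different places. A tiny sketch of the naming conventions for a hypothetical domain (no network calls; note that a DANE deployment using SRV records would actually look up the TLSA record for the SRV target, per RFC 7673 - this shows only the direct-connection case on the default server-to-server port):

```python
domain = "example.com"  # hypothetical domain, for illustration only

# DANE: a TLSA DNS record named after the port and protocol
# of the service (XMPP server-to-server defaults to 5269/TCP).
tlsa_name = f"_5269._tcp.{domain}"

# POSH (RFC 7711): a JSON document fetched over HTTPS from a
# well-known path, named after the service being verified.
posh_url = f"https://{domain}/.well-known/posh/xmpp-server.json"

print(tlsa_name)  # _5269._tcp.example.com
print(posh_url)   # https://example.com/.well-known/posh/xmpp-server.json
```

The extra HTTPS fetch is where POSH's added latency comes from, but it's also what lets a hosting provider be delegated via the domain's web server rather than via DNSSEC.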
Yes, definitely. Prosody supports DANE, but DNSSEC deployment continues to be an issue when talking about the public XMPP network at large. Ironically the .im TLD our own site is on still doesn't support it at all.