NIP-XXX Public Groups #483
base: master
Conversation
I think voting is one special kind of group management. There should be owned groups as well. In one of his interviews, @rabble says that, of the social protocols in his database, the group encryption scheme he likes best is the one employed by P2Panda. Here is their spec: https://p2panda.org/specification/encryption
Interesting point @bu5hm4nn. Voting prevents a single compromised key from removing the other administrators and taking over the group. Are there instances where the convenience of not having to prompt other admins before removing an admin outweighs the compromised-key threat? Regular nostr user admins could be prompted by one of their clients to vote on the proposal, should this be widely adopted. By 'owned groups', do you mean groups with a single admin account? This can be achieved by adding an admin group with a single member.
Thanks for sharing that link. I didn't even consider encrypted content or private group membership, as it is so far outside the wheelhouse of the GitHub replacement use case. I think a model for safer encryption through nostr would need to come first before something like that could be attempted.
Something I didn't make explicit in the text is that public keys can appear multiple times to increase their vote share. Perhaps 'simple majority' should be replaced by 'at least 50%' and 'super majority' by 'at least 66%'. This would enable a better workflow for two or three administrators, which I expect would be commonplace. A single user could have an offline key with 2 votes and their standard key with 1 vote, allowing them to regain control of their groups if their standard key were compromised.
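As an illustrative aside (the names and the mapping of actions to thresholds below are assumptions, not part of the draft), a client-side tally under the 'at least 50%' / 'at least 66%' rules might look like this, with repeated pubkeys counting as extra vote share:

```typescript
// Hypothetical sketch: admins lists the current admin keys, where a pubkey
// may appear more than once to hold extra vote share; votesFor is the set of
// pubkeys that have published an approval for the proposal.
type ProposalKind = "add_admin" | "remove_admin";

function proposalPasses(
  admins: string[],
  votesFor: Set<string>,
  kind: ProposalKind
): boolean {
  // Each occurrence of a pubkey in the admin list counts as one vote.
  const votesCast = admins.filter((pk) => votesFor.has(pk)).length;
  // Assumed thresholds: at least 50% for additions, at least 66%
  // (super majority) for removing an admin.
  const threshold = kind === "remove_admin" ? 0.66 : 0.5;
  return votesCast / admins.length >= threshold;
}
```

With `admins = [offlineKey, offlineKey, standardKey]`, the offline key alone carries two of three votes (about 67%), which matches the key-recovery scenario described above.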
I think this NIP is coming at the problem from the wrong direction (that's not to say it's ultimately a bad idea). The real challenge is not assigning group admins; it is determining an architecture that supports group privacy. The approach that emerges to solve that problem will dictate the level of trust required (and available) to solve these more peripheral problems. Access-limited groups aren't possible without either relay support or magic cryptography. The latter seems to be an open question for academic researchers, so for now groups have to involve relays in some way or another. Luckily, that's the whole point of nostr: to distribute trust across multiple relays. Once a set of trusted relays is selected, events published to a group would only go to those relays. Once that framework is decided, moderation, member management, admin voting, trusted relay selection editing, etc. become much more concrete problems.
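A minimal sketch of that relay-scoped model, assuming a hypothetical publishToRelay helper and an invented group-config shape (read access would then be enforced by the relays themselves, e.g. via auth messages, rather than by cryptography):

```typescript
// Sketch only: group events are sent exclusively to the group's trusted
// relay set; nothing here is part of the draft NIP.
interface GroupConfig {
  id: string;              // group identifier
  trustedRelays: string[]; // the only relays group events are published to
}

declare function publishToRelay(url: string, event: object): Promise<void>;

async function publishGroupEvent(group: GroupConfig, event: object): Promise<void> {
  // Events never leave the trusted relay set; those relays then gate read
  // access for anyone querying the group.
  await Promise.all(group.trustedRelays.map((url) => publishToRelay(url, event)));
}
```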
That magic cryptography solution is... megolm?? I've been thinking the same things @staab, and came to the same conclusions.
This NIP should really be called 'public groups', as 'groups' alone is misleading. Public and private groups require two very different models. I agree that a solid implementation of private groups would be extremely beneficial and would have wider application than public groups. I haven't been following the developments in private messaging closely enough to express a considered opinion. @staab, what you are suggesting is to significantly increase the level of trust placed in relays so they also act as control systems. Intuitively that feels like it goes against the trust-minimised philosophy of nostr. I'll ruminate on this...
I agree that this is the case for read access but not for write access. There are quite a few use cases that require validation that messages were issued by members of a group whose membership is public. This draft NIP achieves this without increasing trust in relays or requiring them to add support for the NIP. This would also work in a private relay context, where the group membership and message content are only accessible to those who can access the relay. Originally I was thinking this would be behind an enterprise firewall, but I suppose using auth messages to achieve this is exactly the model you are proposing. It would only be a matter of time before a relay was hacked and the messages revealed without the 'magic cryptography' that you speak of. If we had that, auth messages would just protect metadata (no bad thing). I'll update the NIP to call this public groups.
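As an illustrative aside, the write-access check described here needs nothing from relays: once a client has derived the (public) membership list from the group's events, it only has to verify the author. A rough sketch, with verifySignature standing in for whatever event-verification call the client already uses:

```typescript
// Sketch only: validate that a message was written by a current member of a
// group whose membership is public. The event shape is simplified and the
// verification helper is assumed, not specified by this draft.
interface NostrEvent {
  id: string;
  pubkey: string;
  sig: string;
  kind: number;
  content: string;
}

declare function verifySignature(event: NostrEvent): boolean;

function isAuthorisedGroupMessage(event: NostrEvent, members: Set<string>): boolean {
  return verifySignature(event) && members.has(event.pubkey);
}
```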
I think it's actually precisely consistent with the philosophy of nostr, which is not to minimize trust, but to distribute it across as many parties as possible. In nostr, you're not any less reliant on 3rd parties than you are in traditional systems, you're just reliant on more of them, reducing the chances of data loss, but actually increasing the chances that your trust will be betrayed by some relay or other. I only mention this here because it's a very interesting concept to me. At the same time, you're right that in this scenario a relay's betrayal will have more impact, since the data is private.
Doesn't this require consistent state to work? By which I mean everybody needs to see all the messages in order to accurately participate in the group? And not only see all messages, but also know they have seen all messages -- it's like a blockchain. That is what goes against the philosophy of Nostr, not what @staab said. I like @staab's approach much more. It is much simpler, easier to implement and more likely to work.
This is also my main issue with this particular PR |
In the spirit of exploring interesting concepts, let me put to you the contrary position :) If we start trusting that a relay won't relay certain messages to certain parties, or will only relay authorised messages, then we are increasing, rather than decreasing, our risk exposure for every new relay we push to.
Yes. It doesn't require all the messages related to group activity, but it definitely requires a specific small subset of messages: those related to group membership / admin changes. This is likely to be less than 1%.
It only needs to see a message sent after one it hasn't received to realise it doesn't have the complete state. That's a very big 'only'. The context I had in mind for this was maintainers groups on code repositories, where there may be 1 or 2 changes a year.
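As an illustrative aside (the 'prev' link below is an invented mechanism, not something this draft specifies), one way a client could detect that it is missing state is if every membership/admin change referenced the event id of the previous change, forming a simple chain:

```typescript
// Sketch only: each change event is assumed to carry a hypothetical pointer
// to the previous change's id, so a gap in the history is detectable locally
// without needing any other group traffic. Assumes a single linear chain.
interface ChangeEvent {
  id: string;
  prev: string | null; // null for the group's founding event
}

function hasCompleteHistory(changes: ChangeEvent[]): boolean {
  const byPrev = new Map<string | null, ChangeEvent>();
  for (const change of changes) byPrev.set(change.prev, change);

  // Walk the chain from the founding event and count how far we get.
  let cursor = byPrev.get(null);
  let seen = 0;
  while (cursor) {
    seen += 1;
    cursor = byPrev.get(cursor.id);
  }
  return seen === changes.length;
}
```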
To be honest, I'm inclined to agree with you. You're trusting that:
There is actually a precedent for a trust model like this on nostr: NIP-16 and NIP-33 replaceable events. I was trying to achieve distributed consensus without worrying about honey badgers and byzantine generals. Perhaps we should just inscribe the full events onto the bitcoin blockchain 😆
Agreed, I hadn't thought of it that way.
This is also somewhat true of @earonesty's encrypted group chat proposal #580. My feeling is that instead of baking the key management into the proposal, we should define a way to use gift wrap generically as a way of creating an "encrypted nostr subnet". More thoughts on that here. The keys for these subnets could then be managed in one of many ways, either using the key rotation strategy specified in these nips, or using an access-controlled nsec bunker as described here.
Groups managed by evolving sets of Administrators with verifiable histories.
Created with the GitHub replacement use case in mind. It is the first of the Key Challenges I have written about. Changes merged into a repository can be authorised by a maintainers group managed using this standard.
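As an illustrative aside on that use case (all names below are invented for illustration), a client could check a merge authorisation against the maintainers group's state at the time it was made, since the membership evolves over the group's verifiable history:

```typescript
// Sketch only: rebuild the maintainers set as of a given timestamp from the
// group's membership-change history, then check the authorising pubkey.
interface MembershipChange {
  created_at: number; // unix timestamp of the change event
  added: string[];    // pubkeys added by this change
  removed: string[];  // pubkeys removed by this change
}

function membersAt(history: MembershipChange[], at: number): Set<string> {
  const members = new Set<string>();
  const ordered = [...history].sort((a, b) => a.created_at - b.created_at);
  for (const change of ordered) {
    if (change.created_at > at) break;
    change.added.forEach((pk) => members.add(pk));
    change.removed.forEach((pk) => members.delete(pk));
  }
  return members;
}

function mergeIsAuthorised(
  authorPubkey: string,
  authorisedAt: number,
  history: MembershipChange[]
): boolean {
  return membersAt(history, authorisedAt).has(authorPubkey);
}
```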
Whilst the rules around admin voting and removal may seem involved, I think the UX can be quite simple.