I’m setting aside the question of whether it is illegal, but I think it is legally murky anyways.
The more interesting question is whether or not it is ethical to sell one’s Reddit account.
I, for one, won’t be selling mine. Why?
I think selling accounts contributes to the problem of trust on sites such as Reddit and Lemmy. Why would anyone buy a Reddit account anyway? To appear trustworthy, right? Because karma points and account age have been used as a proxy measure for an account’s trustworthiness. An old account with a fair amount of karma points is a valuable thing for misinformation and/or advertising campaigns. Even if I hate Reddit to the point of wanting to see them literally burn, I still can’t support misinformation and advertisements.
Another thing to consider is giving up control of your account. I’m one of those who scrubbed and deleted my Reddit posts and comments. Selling my account would mean that if Reddit undeletes my posts and comments, I can no longer do anything about it. Worse, someone else is in control. That’s no good for me.
Welp. Reddit should’ve thought about this possibility before screwing over everyone.
But your last argument is reason enough to keep your Reddit account.
I would have gone straight to that if I also ignored the ethics of it all, lol!
You make some really solid points, and yes, the ethics of it is what I would struggle with most. Out of curiosity, do you still use an active Reddit account?
I haven’t been there since last month (June 12). Since then, I’ve only gone back to check whether any posts or replies have resurfaced (undeleted by admins or otherwise).
Last night, however, I commented to inform people about the state of Lemmy apps in an effort to get one or two people on the fence to switch over.
Other than that, I no longer use my account.
I like your style. Thanks for the reply. I will still occasionally spend a little time there on my lunch breaks myself. Maybe I should try to spread the word as well.
So long as you’re aware of the dangers of spreading the word, yeah, why not?
I’ve gotten shadow-banned because of what I did, but no regrets.
Presumably because all the good usernames are already taken.
Because karma points and account age have been used as a proxy measure for an account’s trustworthiness.
For the sake of moderation effort, that is pertinent, yes. It is true that a new account is more likely to spam than an old, active one, so some moderators will choose to only allow the latter to post to make their lives easier, and selling accounts circumvents that. But if someone is foolish enough to give their free labour for Reddit’s gain, let them work like dogs. Why should anyone else care?
is a valuable thing for misinformation and/or advertising campaigns.
Only if moderators stop doing their job. However, Reddit has made it pretty clear that if moderators stop doing their job, they will be kicked to the curb. It is doubtful that there is any material difference here.
Perhaps you are implying that the Reddit crowd doesn’t have the wherewithal to avoid the fallacy of authority, but if you truly think they can’t even work through a simple logic exercise, there is no hope no matter how hard you try to babysit them. At that point they are already certifiably braindead. Let them go.
Good point regarding usernames. I’ve forgotten about that.
But if someone is foolish enough to give their free labour for Reddit’s gain, let them work like dogs. Why should anyone else care?
I hate to say this, but are you willing to extend this reasoning to moderators here in Lemmy? They’re also giving their free labor here, right? Or is it different because “no corporation gains” here?
I’ve never been a moderator over there, nor here, but I got a close look at what it entails more than a handful of years ago, in a relatively small forum. The most surprising thing for me is that even there, content moderation was already a headache, and a thankless one at that. Whatever could make that job a little bit easier will be used. Karma points and account age are, as I’ve pointed out earlier, used as tools to make one’s job as a moderator easier. Do I think it’s warranted? Personally, no. The best way to deal with each report, each person, is on a case-by-case basis. But when you’re swamped with reports and untrustworthy people (to the point that the starting assumption is that any one person is untrustworthy), I can understand why anyone would take a little shortcut here and there.
My argument against selling accounts was never really focused on Reddit; I called it out by name because of its size and influence.
For a bit of background: I am Filipino, living in the Philippines, which has become a testing ground for the weapons of mass disinformation during the last few national elections. The report Architects of Networked Disinformation focused on Facebook, which is where most of the troll farms have operated. However, I have no reason to believe that the same techniques can’t be applied to places such as Reddit (or even here on Lemmy). Furthermore, while the focus during the 2022 national elections was also on Facebook, there is reason to believe that some activity has also happened elsewhere, particularly on Reddit, where supporters of Leni Robredo (who ended up second in the race) congregated to avoid the toxicity that is Facebook.
So, does that mean I’m already a veteran when it comes to facing disinformation? No, hell no. However, my own personal experience, albeit colored by personal biases, has given me at least one insight: misinformation is insidious and isn’t as obvious as one might mock it to be. It isn’t as simple as “Vote for XYZ, because he will bring us to a new golden age, make ABC great again!” It isn’t always, in your own words, “a simple logic exercise.” While it’s easy to point out and laugh at the failures, those are just that: failures. To quote from a section that highlights the experiences of community-level fake account operators:
They mention that their ultimate failure as fake account operators on Facebook is when they are called out as a fake account (“That’s game over! That usually shuts us up”).
Yes, of course it’s considered a failure to be outed as a fake account operator. It would be “a simple logic exercise” to extend this reasoning to why a fake account operator would be a lot more careful and more subtle in their operations, perhaps even utilizing the best (and worst) of human psychology and sociology to use our very own intellect, feelings, and connections against us. This exercise is left for the reader.
Now, all of the wall of text you’ve hopefully endured thus far leads to one point, which I’ve already alluded to in my earlier reply: Selling accounts feeds the misinformation machine. I cannot, in good conscience, allow myself to feed with my own hand this very machine that has done a lot of damage to countries all over the world.
My apologies if I have been combative in my tone, but I hope you understand that it’s coming from a place of … trauma, I guess?
I hate to say this, but are you willing to extend this reasoning to moderators here in Lemmy? They’re also giving their free labor here, right?
Sure. Although presumably they also own the instance, so it is free labour in the way that clearing your own laneway of snow is free labour: you are the beneficiary of the capital ownership. That is different to volunteering to clear the snow in the Walmart parking lot.
Selling accounts feeds the misinformation machine.
Only if the moderators can’t keep up due to the aforementioned volume problem. The “trustworthiness” of the account has no bearing on the “trustworthiness” of the message that gets delivered. A professional surgeon exclaiming that the world is flat wouldn’t make it any more believable than a homeless person standing on the street corner. The person delivering the message is irrelevant, hence why it is a logical fallacy to try and ascribe value to a person’s character in this regard.
My apologies if I have been combative in my tone
No need to apologize. Tone has no bearing on the content you are delivering. If I cannot separate the two, I have failed as a human and can be lumped among the braindead.
No need to apologize. Tone has no bearing on the content you are delivering.
Thank you! It’s really been a breath of fresh air to be able to assume that the person I’m talking to is arguing in good faith.
Now, to address your points.
As far as I understand Lemmy and how federation works, moderators and admins are two separate things, with their own separate responsibilities and (hopefully) their own separate (albeit overlapping) mechanisms and toolsets for content moderation. But that’s mostly beside your point, which is that “it is free labour like clearing your own laneway of snow is free labour.” I think we’re on the same page there. The dynamics are a little bit different here, but I would argue that the challenges are still similar enough.
Another point I would like to briefly mention here is the incentive one might have to moderate a community here, or on Reddit: the desire to cultivate a community of like-minded people.
For all the faults Reddit has, I think its greatest edge (and, in other respects, its greatest weakness, in my opinion) is that it provides a ‘one-stop shop’ for people to find a community of like-minded people. By funneling a huge number of people into one place, it makes it easier to find others interested in, for example, prostitution in late medieval Constantinople (a facetious example, but I hope it illustrates my point well enough). It’s easier to establish a community for really niche subjects when you’ve got a lot of people in one place. This also increases the desire to nurture that community, even to the point of withstanding abuse, not just from users, but also from admins and whoever else.
A point that I have probably not addressed well enough in my previous reply is the challenge of content moderation everywhere (from small forums, to Reddit, to Lemmy). I hope you’ll agree with me that it is a huge challenge, especially in places with a large amount of activity. Sifting through each report and making sure that you’re making the best judgement call for each case is not an easy task.
This brings me to your next point, which I admit I’ve sidestepped a bit: “The ‘trustworthiness’ of the account has no bearing on the ‘trustworthiness’ of the message.” I agree in principle, but in practice? I doubt it. A good portion of the report I’ve linked earlier describes the amount of work that goes into making a fake account trustworthy. Even if we assume that moderators scrutinize every report, checking the account’s activity history, and even (in the case of Facebook) going through the trouble of cross-checking friends and mutual friends, it would be a difficult job separating fake accounts from legitimate, albeit similar-looking, ones. Furthermore, given limited time and resources (especially if we’re talking about volunteer community moderators), the time and effort that would go into this would detract from other moderation tasks that need attention as well. Given a glut of accounts for sale, it would be far easier for troll farm operators to just buy more accounts, disposing of “exposed” ones as they go.
I do hope I’ve addressed your points satisfactorily. It’s been a pleasure discussing this with you thus far.
Well, I don’t know if I’d go that far. By common definition, argument implies that one wants to persuade another to share in their views. Truthfully, I have no care as to what kind of views you hold. There is no value proposition in trying to persuade you. That is not the value of discussion.
moderators and admins are two separate things
Yes, it is true that you can metaphorically “shovel the parking lot at Walmart” on Lemmy, but my assumption was that anyone who has an interest in moderating would also take ownership of an instance. That said, it seems we are moving past the question of: Who cares if they have to work like a dog? One does not become a volunteer moderator if they don’t like the work. Perhaps providing more moderator duties is actually providing them a service?
I agree, in principle, but in practice?
I took it that you had already suggested that some people just aren’t capable of understanding the separation, and I acknowledged that as a real, possible scenario. But I also questioned how far you can really go by babysitting them. At some point you have to accept the lost cause, no?
If the likes of you and I are doing our jobs properly, our world should be robust enough to handle those with such disabilities and other life challenges. Are you suggesting that you and I have failed?
There is no value proposition in trying to persuade you. That is not the value of discussion.
In most cases, yeah, I’d agree. In this particular case, it can be argued one way or the other. Someone could see us and say, “megane-kun is trying to convince me not to sell my account,” for example. Personally, I agree with you on that one point in this instance. I think we’re just two people exchanging views on a certain question. But in certain cases, where persuasion leads to action, it’s different. And this is actually pretty much intertwined with the topic at hand. It is this potential for action that makes persuasion valuable in certain cases.
I took it that you had already suggested that some people just aren’t capable of understanding the separation, and I acknowledged that as a real, possible scenario. But I also questioned how far you can really go by babysitting them. At some point you have to accept the lost cause, no?
I’m arguing that, no, in certain instances (and in the cases most relevant to the question of selling one’s account), there are people at work making this separation really hard. The job of a troll farm, the most likely buyer of one’s account, is to make that distinction very hard for community moderators and ordinary people alike (even more so for the latter), so incredibly hard that people either give up on making it or fall back on sloppy shortcuts (account age, karma, etc.). Why do all this? To influence others to do certain things, be it in politics (voting for one person over another) or in advertising (buying a product that you don’t need).
Personally, however, I agree. Some people just can’t be helped, and it’s unfortunate that such people could be a majority. However, since I know with some confidence that there are bad actors at play, working overtime so that the likes of us will always fail, I have to be a little more understanding, and be aware that I myself can fall victim to such machinations.
I suppose the crux of this discussion is this: I am arguing that selling my account helps the disinformation machine; you’re arguing that it doesn’t matter, that the responsibility lies with the person imbibing information.
If that’s the case (and I’m understanding you properly), well, personally speaking, I see no reason why it can’t be both: be responsible in imbibing information, but be aware that there are bad actors at play (and don’t lend these bad actors a hand).
I think we’re just two people exchanging views on a certain question.
Wouldn’t you say that writing comments on forums is a solitary activity? I mean, there is good reason to believe you exist out there as another person, but is that not just an implementation detail of the software? Would it make any difference if the software quietly replaced you with something akin to ChatGPT? The answer is no. If done well, I’d never notice. The value is not in the exchange with another person either.
Don’t get me wrong. I do believe there can be value in exchanges with other people. But when one seeks that, they go outside. This environment is quite different; it is very much designed around the individual.
But in certain cases, where persuasion leads to action, it’s different.
I didn’t mean to suggest that argument cannot happen, but I don’t see the value in it. I don’t find enjoyment in changing random internet strangers’ (or ChatGPT models’) minds. And when I’m spending my precious free time, it had better be enjoyably spent. If they are “wrong”, that’s their problem. One needs to take some relaxing downtime just for themselves now and again.
that the responsibility lies with the person imbibing information.
Well, I suggest that the responsibility lies with us to create a world where people’s struggles with misinformation are not an impediment. I get the appeal of trying to sweep them under the rug. It unquestionably makes things easier, like not needing to accommodate those in wheelchairs makes things easier. But must we take the easy road?
My argument has always rested on the question of why someone might buy an account. Whatever it is that we enjoy online, in places such as Lemmy, Reddit, or wherever, is beside the point.
I get the appeal of trying to sweep them under the rug. It unquestionably makes things easier, like not needing to accommodate those in wheelchairs makes things easier. But must we take the easy road?
I’d say it’s not even a problem of someone sweeping things under the rug, but an intruder throwing dust and trash all around.
I think I’ve already said my piece here, and as you’ve said: “Truthfully, I have no care as to what kind of views you hold. There is no value proposition in trying to persuade you. That is not the value of discussion.”