57 comments

  • otterley 1125 days ago
    Recently Apple added a feature to iOS that lets you make only selected photos accessible to an app. This allows the user to respond positively to an access request while letting the app see only a subset (or none) of their actual photos.

    It would be a very useful feature for Apple to do the same for contacts: the app would think it's getting access to your contacts, but would only actually receive a subset of them, and be none the wiser. This would be a tremendous boon for privacy.

    • rsync 1125 days ago
      "Recently Apple added a feature to iOS that lets you make only selected photos accessible to an app."

      What we really need to see from Apple is a permissions index in the app store that allows me to inspect, and consider, the permissions that an app will request before installing that app.

      I shouldn't have to install the app (or do laborious research online) to discover what permissions it will attempt to utilize and which of them are required to function.

      It would be trivially easy to list that in the app store, for each app.

      • djrogers 1125 days ago
        > and which of them are required to function.

        On the iOS App Store, none of the optional permissions can be required for an app to perform its basic functions - that's a store policy, and it's generally well enforced. Obviously if your app's function is mapping, GPS can be required to use those features (but only at the user's discretion - i.e. while running or all the time, granular or coarse), but the app can't just refuse to launch without it.

        • DanAtC 1125 days ago
          Tell that to Citizen, which refuses to operate without location enabled and, even worse, refuses to operate with coarse location. And since it's a free app, there's no place on Apple's site to report this bad behavior.
          • 3825 1125 days ago
            To those who, like me, are not in the know: I think the parent comment is about an app that used to be called Vigilante.

            https://en.wikipedia.org/wiki/Citizen_(app)

            > Citizen is a mobile app that sends users location-based safety alerts in real time.[1][2][3][4] It allows users to read updates about ongoing reports, broadcast live video, and leave comments.[1][2] The app uses radio antennas installed in major cities to monitor 911 communications,[5] with employees filtering the audio to generate alerts.[5] In March 2020, Citizen added the COVID-19 digital contact tracer SafePass. The app is currently available for iOS and Android devices[6] in 20 cities,[7] including New York City, the San Francisco Bay Area, Baltimore, Los Angeles,[8] Philadelphia,[9] Detroit,[10] Indianapolis,[11] Phoenix,[12][13] Cincinnati,[14] Chicago, Minneapolis, Saint Paul, and Cleveland.[15]

          • diebeforei485 1124 days ago
            Yes, they are extremely aggressive. Also, their payment screen about "start for free" leads to a $199 payment after a short two-week trial.
      • lanstin 1125 days ago
        All these permission choices should be invisible to the app. If I say no contacts, the call should succeed but with a zero-length response. It shouldn't be possible for apps to say you have to agree to this or I won't run. I run the software, and as the root user I should control what data the software can use.
        • djrogers 1125 days ago
          > It shouldn’t be possible for apps to say you have to agree to this or I won’t run.

          It's not - that's a violation of the App Store TOS. That's also not what's happening here - you can use Clubhouse without allowing contacts access, but you can't invite someone to the closed beta without allowing it.

          • danShumway 1125 days ago
            GP means that it shouldn't be technologically possible, not just that it shouldn't be possible as a matter of policy.

            The policy solution clearly doesn't work in all scenarios, because Clubhouse is still on the store. But an on-the-fly permission model that allowed the user to deny the permission invisibly, or share only a subset of their contacts, would completely solve the problem regardless of whether or not Apple was effective at moderating.

            Apple could still do whatever moderation they wanted to reduce annoyances for the end user, but the sandboxing approach would catch any apps they missed or refused to moderate.

            This would also solve the problem where an app legitimately needs some access to contacts to run, but doesn't need access to the entire list. Clubhouse does need access to some contacts to invite someone to the beta, but it does not need access to the entire contacts list, and there's no reason for it to have the ability to tell whether or not a user is providing the full list.

          • chii 1124 days ago
            > that's a violation of the App Store TOS.

            Not if the app still functions with deliberately reduced functionality. What I want is for the app to be unable to tell the difference between being denied permission and having no data (or being sent fake data).

          • lanstin 1125 days ago
            They must know that I have disallowed access in that case.
        • dheera 1125 days ago
          > If I say no contacts, the call should succeed but with a zero-length response.

          Actually I would take it further and say that I should be able to define its response, or have it return a random but plausible template response. Otherwise a zero-length response makes it too obvious that you didn't grant it permission.

          I once had an app yell at me for not giving my GPS permission, but then yell at me again when I enabled a mock GPS on Android. It really shouldn't have been able to know I was mocking location.

        • lanstin 1125 days ago
          Or even, as a service, fake data - feed it fake location data and a fake contact list, full of 202-555-1234 type numbers. I always put fake data into web forms, and it is a sign that I don't truly own the phone that I can't do the same for local software.
          • lanstin 1125 days ago
            Like I want a pop-up: this application is requesting your location data. Shall we give the real data, no data, or simulated data? Same for contacts, photos, installed apps, etc. Not saying that would solve all the problems, but it would be user-centric in a way the privacy conversation just isn't.
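            A broker like that could be sketched in a few lines. This is purely a hypothetical illustration (the function names, the three-way choice, and the sample data are all made up, not any real iOS API); the point is that the app-facing call always succeeds, and only the payload changes:

```python
import random

# Hypothetical OS-level contacts broker. The app-facing call always
# succeeds; the user's choice only changes what payload comes back,
# so the app cannot tell real data from empty or simulated data.

REAL_CONTACTS = [("Alice Example", "+1-415-555-0101"),
                 ("Bob Example", "+1-415-555-0102")]

def simulated_contacts(n=3, seed=7):
    """Plausible-looking fake contacts (reserved 555 numbers)."""
    rng = random.Random(seed)
    first = ["Sam", "Pat", "Lee", "Max", "Kim"]
    last = ["Miller", "Chen", "Garcia", "Patel", "Novak"]
    return [(f"{rng.choice(first)} {rng.choice(last)}",
             f"+1-202-555-{rng.randint(1000, 9999)}")
            for _ in range(n)]

def get_contacts(user_choice):
    """What the app sees. Never an error, never a hint of denial."""
    if user_choice == "real":
        return list(REAL_CONTACTS)
    if user_choice == "none":
        return []                    # zero-length response, not a refusal
    return simulated_contacts()      # "simulated"
```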
            • Viliam1234 1124 days ago
              > Shall we give the real data, no data, or simulated data.

              Or even a selected subset of real+fake data. (With the ability to define sets that could be reused across applications.)

              Sometimes I want the app to have access to some of my contacts, but not all of them. For example, my work contacts but not my personal friends, or vice versa. Or simply "my contacts that told me they were also using this app". PLUS hundred fake contacts to poison the data.

            • simonh 1124 days ago
              Giving fake location data could create real problems. Suppose you tell the phone to give a safety or navigation app fake data, then forget, and use the app much later. Now you're using a service that thinks you're in a different location.

              One of the examples given here was an app that gives you safety alerts. A navigation app might give you useless directions. There are a thousand ways this could go horribly wrong.

              I suppose iOS could present some warning, but that might interfere with the UI, or be misunderstood.

        • Larrikin 1122 days ago
          This works fine for contacts, but what should happen when I deny the microphone permission? Should I be able to send Shazam a monotone beep, or even worse, a sample of random sounds that can't even be filtered out?
      • andai 1125 days ago
        I didn't realize iOS doesn't have that. Google Play shows each app's permissions on the listings page.
      • aeternum 1125 days ago
        I'm not sure the permission index would be very useful.

        Most iPhone chat apps for example work perfectly fine with zero permissions granted yet provide the option to send pictures, invite contacts, use mic/camera, send gps location, etc if a user is so inclined. With a permissions index, you would likely end up with the majority of apps listing all permissions and users would simply ignore it.

        • NeutronStar 1125 days ago
          So? Just give me the possibility to see it.
      • l8rpeace 1125 days ago
        +1 and a filter you can use on related permissions when searching for apps
      • behnamoh 1125 days ago
        They have added that, but it's written by the app developers so you still can't trust what they claim they're gathering from you.
        • simonh 1124 days ago
          I think Apple's app reviewers have tools to analyse what APIs and permissions an app tries to access, to check this.
    • NathanielK 1124 days ago
      Under Android, apps can be respectful if they want to be. For example, the Discord Android app will fall back to the system file picker if you block storage permissions. Since there's no incentive to be privacy-respecting, very few other apps I've used let you do this.
  • crazygringo 1125 days ago
    I don't see what the point is.

    "Data poisoning" gives companies a bunch of fake contacts... on top of all your real ones?

    Who cares? So they send some e-mails to addresses that don't exist or something? So it takes up an extra 1% of disk space in their database?

    If you could share an empty address book then that would actually preserve the privacy of your contacts. But this doesn't do that.

    I don't get it.

    • ultimape 1125 days ago
      Bad data makes it less valuable for resale. It's an attack on the market that these things operate under.

      Can also be used as a canary trap.

      • alsetmusic 1124 days ago
        > Can also be used as a canary trap.

        Can you please explain how this can operate as a canary?

        Edit: another post explains that the method is that the bogus data ends up in a data leak, but that would require keeping track of bogus submissions and generating new data for each company where you create an account. Then you'd have to cross-reference like crazy. Am I missing something simpler?

        • itronitron 1124 days ago
          I know at least one person who has their own personal family domain set up so that his family members can just create new email addresses specific to the vendor when shopping online (for example, 'amazon@familyrobinson.com' and 'bestbuy@familyrobinson.com' ). Then all their shopping emails just get routed to his domain. Being able to track which company leaked or sold an email address seems like another benefit in addition to catching all the marketing emails.
          • armedpacifist 1124 days ago
            Gmail has this feature baked in. Append a + sign to the username, then any string you want, i.e. username+ycombinator@gmail.com. It will deliver these mails to your regular mailbox. I started doing this for the exact same reason as mentioned above, but you can obviously do more than just creating honeypots. Also, you have to ignore the fact that it's Google...
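            The scheme is simple enough to automate. A tiny sketch (the helper name is mine, not any real API):

```python
def canary_address(user, service, domain="gmail.com"):
    """Build a per-service plus-address. Gmail delivers it to
    user@domain, but the tag survives in the To: header, so a
    leak or sale reveals which service was the source."""
    return f"{user}+{service}@{domain}"
```

            For example, canary_address("username", "ycombinator") gives username+ycombinator@gmail.com.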
            • zaik 1124 days ago
              There already exists at least one popular js validation framework which removes Gmail (and others) subaddresses per default in its "normalizeEmail" method: https://github.com/validatorjs/validator.js/blob/master/src/...
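              For illustration, here is a Python sketch of the kind of canonicalization such "normalizeEmail" helpers apply to Gmail addresses (dropping the +tag and the dots in the local part); the function is my own approximation, not validator.js itself:

```python
def normalize_gmail(address):
    """Mimic 'normalizeEmail'-style canonicalization: for Gmail
    domains, strip the +subaddress and dots from the local part,
    which defeats plus-address canaries."""
    local, _, domain = address.lower().partition("@")
    if domain in ("gmail.com", "googlemail.com"):
        local = local.split("+", 1)[0].replace(".", "")
    return f"{local}@{domain}"
```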
            • austinjp 1124 days ago
              I do this. A small annoyance can crop up when trying to log in to a service with your + modified email address. Hmmm, what did I append after the plus? If you can't remember that, you can't use the "forgot my password" function either :)

              Usually soluble, but irritating.

        • ultimape 1124 days ago
          This is basically a mechanism that would allow for a honeytrap on steroids. https://blog.finjan.com/honeytokens-used-to-track-cybercrimi...

          > that would require keeping track of bogus submissions

          You don't have to keep track of the submissions because you can generate them with a reversible algorithm. Basically use a word list method https://github.com/bitcoin/bips/blob/master/bip-0039.mediawi... but have the list be entirely people's names (or generated from a corpus of known accounts and something like https://github.com/minimaxir/textgenrnn to make them harder to spot)
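          As a hypothetical sketch of that stateless approach (the name lists, secret, and helper functions here are all made up): derive each app's fake contacts from an HMAC of the app id, so nothing needs to be stored, and a leaked name can be traced back by re-deriving each suspect's tokens:

```python
import hashlib
import hmac

FIRST = ["Ada", "Ben", "Cora", "Dev", "Elif", "Femi", "Gus", "Hana"]
LAST = ["Ashby", "Boon", "Calder", "Dukes", "Ellison", "Fawkes", "Grieg", "Holt"]

SECRET = b"device-local-secret"  # stays on the phone

def honeytoken_contacts(app_id, n=5):
    """Deterministically derive n fake contacts for one app.
    No bookkeeping needed: the same inputs always regenerate them."""
    contacts = []
    for i in range(n):
        d = hmac.new(SECRET, f"{app_id}:{i}".encode(), hashlib.sha256).digest()
        name = f"{FIRST[d[0] % len(FIRST)]} {LAST[d[1] % len(LAST)]}"
        number = f"+1-202-555-{1000 + int.from_bytes(d[2:4], 'big') % 9000}"
        contacts.append((name, number))
    return contacts

def identify_leaker(leaked_name, suspect_apps, n=5):
    """Re-derive each suspect app's honeytokens and report which
    one the leaked name came from (None if no match)."""
    for app in suspect_apps:
        if any(name == leaked_name for name, _ in honeytoken_contacts(app, n)):
            return app
    return None
```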

          > generating new data for each company where you create an account.

          yes https://arxiv.org/abs/2006.15794

          Basically treating the data from various email honeypots as a "numbers station", but instead of using it to prime encryption keys, you use it as a form of steganography. To do this entirely anonymously, the next step would be to publish on a public blockchain or anonymous service, so that the owner's device (which generated the emails originally) can upload a signed statement proving it was the phone that was pwned and naming the offending app.

          A similar idea seems baked into a couple of crypto initiatives https://coincentral.com/sentinel-protocol/ but fundamentally we're talking about an anonymous reputation system modeled after how swarms operate to gossip risk.

          It would be necessarily stochastic in nature because you'd be depending on the 3rd parties to send emails a bit at a time, but if you get a deluge of phones all reporting the same app, you can assume fairly confidently that app has been compromised. Punishment (Brand reputation, sanctions by app store) for being compromised would encourage better security.

          This could (and would need to) be operated at the hardware level and orchestrated by the OS and OS provider. This is the kind of thing Apple and Google could do as part of their privacy initiatives around "differential privacy" https://venturebeat.com/2019/12/21/ai-has-a-privacy-problem-...

          What, you think that deep learning chip on your phone is there to make cute avatars?

      • stjohnswarts 1120 days ago
        Well, since it uses names (last and first) that all start with Z, most of these crooked outfits could survive losing the 0.5% of "real" names off their list by simply filtering out the Z. Z. names.
    • shervinafshar 1125 days ago
      Not an expert on guerilla cyber-warfare, but isn't that the whole point of this sort of poisoning? If enough people do this, the cost of those bouncing emails becomes prohibitive. That's my speculation. Would be great to hear more from someone who knows the domain better.
      • geoduck14 1125 days ago
        Oh! I know! I work for a large company. At one point, we sent so much junk mail, we were the Post Office's #1 customer (in the US). We have started to send junk emails, too.

        There is a company (non-profit? Activist group? IDK) called Spamhaus. They partner with ISPs to help block spam. Their process is something along the lines of:

        1) Inject fake email addresses into lists of email addresses that are bought/sold WITHOUT user consent

        2) Wait until someone UNAUTHORIZED emails them spam

        3) Tell the ISPs (and anyone else who will listen) to STOP processing emails from the companies that sent them spam.

        4) The ISPs block the companies because Spamhaus is reputable and REALLY good at finding spam; also, the ISPs save money by not having to process the spam.

        We accidentally got hold of a bad batch of email addresses several years ago, and we spent MONTHS trying to fish out the bad ones and overhaul our email authorization process. It cost us tens of millions of dollars.

        Also, WHY were we sending that spam?!?

      • remram 1125 days ago
        Even if you make your contact list 99% bounces and 1% real (and every user of the app does the same), I don't see how this becomes a problem for the app's operator. Remove a contact after 1-2 bounces and you're golden.
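          A minimal sketch of that cull (names and structure are hypothetical) shows how little work bounce-only poisoning puts on the operator:

```python
def cull_bounced(contacts, bounce_counts, threshold=2):
    """Drop any address that has hard-bounced `threshold` or more
    times. This one-liner is all bounce-only poisoning is up
    against on the operator's side."""
    return [c for c in contacts if bounce_counts.get(c, 0) < threshold]
```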
        • shervinafshar 1125 days ago
          Fair. Still golden if this needs to be done for all contacts of all users?
          • remram 1125 days ago
            If they bounce they are extremely fast to cull.
        • michaelcampbell 1125 days ago
          Who bounces any more?
          • U8dcN7vx 1125 days ago
            Everyone except nutters. Google bounces. Microsoft too. I don't know of any provider that doesn't. Surely someone somewhere does not bounce but they are in the minority.
      • jmatthews 1125 days ago
        You don't bounce emails; you prebounce them and clean up your list. This is part of any sensible data engineer's process.
        • jmatthews 1125 days ago
          More helpfully: salting their db with real emails but fake contact info requires a more durable hygiene process, and washing it out often isn't worth the effort for data-driven shops.

          You serve a variety of email domains that validate as deliverable, then you accept emails and report the sender, which hurts their deliverability.

        • TedDoesntTalk 1125 days ago
          what is prebounce?
          • shervinafshar 1125 days ago
            My guess was they are referring to one of these services that check the validity of any email address. A false signal from one of these services prevented me from signing up for some random website with a .name domain the other day.
          • tyingq 1125 days ago
            Email them from a throwaway domain and ip, toss out hard bounces from the list, so you don't poison your SMTP reputation?
      • dogman144 1125 days ago
        Pretty nifty side point:

        > If enough people do this the cost of those bouncing emails would become prohibitive.

        This idea got a ton of attention in the early days of tech, and led to what's known as proof of work: see Bitcoin. The primitives of BTC show up in a lot of interesting areas.

    • cyral 1125 days ago
      It will be interesting to see if these fake contacts show up in a leak somewhere someday. Almost like how people do myname+yourcompany@gmail.com, we could create similar fake contacts to see who is selling or leaking data.
      • shervinafshar 1125 days ago
        *@myname.name
        • _0ffh 1125 days ago
          Yeah I've been using yourcompany@mydomain.tld for ages to track who's sold or fumbled my data. Since haveibeenpwned I can even approximately separate the two groups. Surprisingly up to now nearly all incidents (that I know of) have been breaches. Not that it makes it any better.
          • atleta 1124 days ago
            Same here. Though people usually give me weird looks when I do this IRL and ask me "Is your email address really ourcompany@yourdomain.tld"?
            • aylons 1124 days ago
              I had these weird looks, and I even got engaged in a nice conversation with an employee at a t-mobile store, but some people don't even bat an eye.
    • blablabla123 1124 days ago
      I'm surprised it has such a harsh name. Years ago I was wondering what would happen if people just came up with random data, e.g. derived from their own personal data (thinking about crawlers and automated personal-data processing). But I'd just call it creating garbage data, because that's what it is. Eventually it will be impossible for an algorithm to distinguish between real and garbage data. (And probably not only for an algorithm.)

      But I agree with you, it's probably the wrong approach. Personally I've deleted quite a lot of accounts/uninstalled bloated apps. In addition I use tools that actually set additional boundaries, but I'd prefer if the apps wouldn't be so data hungry in the first place.

      • m463 1124 days ago
        I think the trick is that different users might create "identical garbage" so that contacts match on the backend.
        • blablabla123 1124 days ago
          Hm ok, that's a nice trick though
    • loveistheanswer 1125 days ago
      The vast majority of phone calls I receive are spam calls by people/robocallers which I did not give my phone number to, but apparently someone else did. I don't want people sharing my phone number with random other people
      • crazygringo 1125 days ago
        Nobody had to give them your phone number.

        They just dial numbers at random. Phone numbers aren't sparsely distributed. There are entire area codes that are essentially fully utilized.

      • stjohnswarts 1120 days ago
      I thought that the new challenge-response system between providers (SHAKEN/STIR?) was supposed to take care of 99.9% of this. I guess I'm off to see if they have actually fully implemented it between providers.

      Edit: looks like this summer is the mandated time. From the Wikipedia article on STIR/SHAKEN: "As of 2019, SHAKEN/STIR is a major ongoing effort in the United States, which is suffering an 'epidemic' of robocalls.[1] Both the Canadian Radio-television and Telecommunications Commission and the Federal Communications Commission are requiring use of the protocols by June 30, 2021.[2][3]"

    • Triv888 1125 days ago
      In addition to that, you probably don't get in contact with fake contacts very often... it is just smoke and mirrors to make users think they have privacy...
    • anotherfish 1124 days ago
      Better to use canary email addresses that actually go somewhere you can detect incoming emails. Then it would be useful.
  • bredren 1125 days ago
    Clubhouse requires your contact list in order to get invites, which are required to sign up right now.

    I get why they are doing this, and it caused me to share my contacts with them.

    However, I resented it and it put me immediately in a defensive posture with the product and company.

    There is no possible way to trust a company with your contact list, and Apple should make it work the way Photos works now - where you can select which data to share. There are some folks I don't even want to possibly find in a social app.

    • post_break 1125 days ago
      I mean this is why they do it. You knew it was wrong, you knew they were going to take that data and mine it, and you still said sure.
      • jancsika 1125 days ago
        Are you writing that to emphasize the urgency for the government to pass legislation to rein in unregulated online casinos as they continue refining their dark patterns? (I.e., without legislation, these companies will keep finding more and more sophisticated ways to get users to act against their own interests.)

        Or do you mean to imply that a practical approach to reining in unregulated online casinos is to spread the message of "Just Say No" in web-forum comments to the ostensible addicts?

        Or to be fair, something else entirely? My point is I can't tell without context there whether you are sympathizing with the user ("ah yes, something needs to be done because they've found your weak spot"), or chastising them for not having the force of will to resist dark patterns.

        Edit: clarification

        • DevKoala 1125 days ago
          Not the poster you are replying to, but I stopped feeling empathy for people who complain about lack of privacy, yet willingly give up their data to non-essential services that ask for it with all the proper disclosures.

          If you agreed to share all your contacts to listen to "musical tweets", I don't see why you'd be complaining. You willingly made a trade-off.

          • bonoboTP 1125 days ago
            Social status is a hell of a drug. Clubhouse is a place where people like Elon Musk and famous successful scientists and businesspeople hang out, so all the hustler get-rich startup people want to be on board. It's exclusive, it's just for fancy iPhone users. Finally an elite place you can only get into by invite; most cannot resist. If they miss out on the bandwagon, how can they call themselves early adopters on the bleeding edge? What will their friends think of them? Almost as if they used Android or something.
          • jimkleiber 1124 days ago
            I wish it were more clear what exact info we were giving to the app (not just generic "contacts" or "photos") and when the app is receiving that info.

            I know more about coding than 95% of my friends and I still don't fully understand the depth of info that I transmit to an app when I agree to give it permissions on my iOS device.

            E.g., if I give Whatsapp access to my photos on my iPhone, does that mean all of the photos that are stored on my iPhone, including screenshots and hidden photos, are uploaded to Whatsapp servers? Does it upload when I take a new photo or when I open the Whatsapp app?

            So if, in this case, Whatsapp is indeed pulling all of a user's photos, including hidden photos, to their servers, I imagine many people would not want that to happen. So 1) I'd want to know ahead of time exactly what will be pulled and 2) ideally, I could have a way to use the app without giving it the keys to everything.

          • WA 1125 days ago
            ... willingly give up other people's data.
            • DevKoala 1124 days ago
              Exactly. What a jerk right?

              “Oh noes I had to give up all of my contact’s personal information... but I got into that beta!!”

      • ganstyles 1125 days ago
        Correct. I've been a member for going on a year now and I have scores of invites I don't appear to be able to send because I won't share my contacts. Not that I care enough to invite people, but it's a dark pattern to even require it.

        I have heard there's a way to share invites without sharing contacts, but I haven't cared enough to even do a cursory search on that.

        • chipsa 1125 days ago
          Unsync your contacts from whatever service provider you're using, make sure they're gone, go ahead and share the (now empty) contacts with Clubhouse, get the invites, then revert everything back?
      • tonylemesmer 1125 days ago
        That means that, more than likely, Clubhouse has our details even if we have no desire to be part of it.
        • srockets 1125 days ago
          It'd be fun once they have an EU presence.
          • jimkleiber 1124 days ago
            I think they had a wave of people join from Germany either earlier this month or last month, so I imagine there are already plenty of Europeans on the app. Plus, doesn't GDPR apply even if there's just one user who resides in the EU?
            • srockets 1122 days ago
              Having European data subjects is enough ground to ask them to abide by the GDPR. But assume they won't, then you'll have to go to a European court, which could rule whatever, but it can't do much to a company that has no money or persons in the EU to collect from.
              • jimkleiber 1118 days ago
                Yup, I've been wondering just how inept national and local governments seem to be at regulating a global internet. Seems that if we want to have a global internet, we eventually might have to come up with a more coherent form of global governance.
      • arkitaip 1125 days ago
        Fomo is a helluva drug.
      • toss1 1125 days ago
        And this tells me that there is a need for another step up for this app - to not only poison the contacts, but to temporarily 1) back up => 2) delete => 3) share the poisoned list => 4) restore contacts.

        So we can share the list, but they'll never get our real contacts, only trash data. If enough people use it, maybe they'll stop.

        • a3n 1125 days ago
          But wouldn't this company have to periodically review your contacts, to slurp up new ones?
          • toss1 1125 days ago
            Yup, probably their next move would be to require constant access to contacts list and check whenever the app runs.

            The next move on this side would be to keep contacts in a separate app from the standard Android/Apple one, and then make calls, texts, etc. from there.

            If only there weren't so many sociopaths running these companies... sorry, wrong planet

      • kzrdude 1125 days ago
        They do it because all the successful social apps need to make contact discovery easy. The ethical ones that don't use this trick we don't hear so much about; maybe they don't succeed.
        • goatinaboat 1124 days ago
          They do it because all the successful social apps need to make contact discovery easy.

          Signal does it with hashes, which it doesn't store anyway:

          https://support.signal.org/hc/en-us/articles/360007061452-Do...
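          Roughly, that works like the sketch below: the client hashes each normalized number and the server intersects those tokens with hashes of registered users, keeping nothing. (Parameters here are illustrative; note that plain hashing of the small phone-number keyspace is brute-forceable, which is why Signal later moved contact discovery into secure enclaves.)

```python
import hashlib

def number_hash(number):
    """Truncated SHA-256 of a normalized phone number; the server
    sees this token, never the plaintext number."""
    return hashlib.sha256(number.encode()).digest()[:10]

def discover(my_numbers, registered_hashes):
    """Return which of my contacts are registered, by comparing
    hashes only. The server can discard the hashes afterwards."""
    return [n for n in my_numbers if number_hash(n) in registered_hashes]
```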

        • forgotmypw17 1125 days ago
          There are quite a few that have not done it. I don't think it's necessary for success at all.

          HN seems to be doing pretty well, and it's never done this sort of thing, as far as I know.

          Reddit never did it during their growth phase, instead they provided their own seed content.

          Metafilter has never done anything unethical to my knowledge.

          There are many, many successful social networks which have not performed unethical contact harvesting and other shady things.

          • skinnymuch 1125 days ago
            Clubhouse raised money at a billion dollar valuation. Hacker News specifically and Metafilter aren’t in the same stratosphere
            • lupire 1125 days ago
              But still, why do they need to take your whole address book? They could offer to let you invite your contacts without demanding it. Is the profit contingent on selling the address-book data? To the point where they won't let you invite more people (and help them grow!) without it?
              • skinnymuch 1124 days ago
                The pushback is minimal. A lot of the pushback probably comes from people who are going to be upset by many things; specific Reddit communities and Hacker News are good examples of that. If those demographics are unlikely to be happy with your social product's privacy and its dark (or non-dark) patterns, catering to them makes no sense.

                I don't know anyone outside some geeky sites, and only one person personally, who cares about any of this. Some do casually say it's lame, but it's not going to be a deciding factor for using the app.

                To add on to the whims of the geeky communities: some companies escape it more than others. Airbnb didn't get much flak for spamming Craigslist users in its early days. Compared to the negative talk about Uber, Facebook, etc., they also get nowhere near as much criticism for the way they incentivize negative aspects of their platform.

                All of this to say - there’s no real downside if money and power is the primary goal.

                • forgotmypw17 1124 days ago
                  >I don’t know any one outside some geeky sites and only one person personally who cares about any of this. Some do say lame casually. But it’s not going to be a deciding factor for using the app.

                  My experience is completely different from yours. Out of the dozens of people I've spoken with about this stuff, I can't remember a SINGLE PERSON who didn't express dissatisfaction with at least one of: lack of privacy and potential willy-nilly snooping by CompanyX employee; arbitrary blocking and post removal without good cause; bad interface design; low quality of content.

                  I don't go fishing for it either, it just happens in conversation, although I sometimes am the first to broach the subject of social networks.

                  • skinnymuch 1124 days ago
                    Sorry, I was exaggerating and didn't make clear what I meant. Yes, people care. A lot of people will say something or another. Most won't actually refuse to connect their contacts to Clubhouse, though.

                    So I see the same dissatisfaction; I find it to be closer to a slacktivism level of caring for most people, though. If they stop using an app, or don't use it from the get-go for reason X, it usually doesn't align with how they're using other apps. So technically they did avoid something because of the dissatisfaction you described, but they aren't applying that even remotely consistently.

                    • forgotmypw17 1124 days ago
                      I think it is just habit and culture, and both can change very quickly. There are many examples throughout history, e.g. civil rights movement.
            • forgotmypw17 1125 days ago
              What are you trying to say?

              That because they have a lot of VC money riding on it, they have to do "growth hacking" in order to justify the funds and grow quickly enough to satisfy the investors?

              Well, I guess I have to agree.

              • skinnymuch 1124 days ago
                I assumed the OP meant successful social apps as in successful to the point of being known by at least some average people. Metafilter and Hacker News are both very niche and tiny.

                Hacker News doesn’t have the same business model as others either. It’s to help the namesake incubator. It succeeds with that. Getting contacts etc wouldn’t benefit Hacker News. Hacker News could lose a decent amount of money yearly without any prospect of breaking even and still be run.

                Most other apps of any kind can’t be run that way, including Reddit and Metafilter.

                • forgotmypw17 1124 days ago
                  Perhaps it is just a fact that we have to accept that large (1M+? 10M+? 100M+?) social networks cannot remain sustainable without abusing their users. That would mean we can benefit from building smaller, sustainable communities for ourselves and those we care about. I'm surprised it's not happening already, to be honest.

                  With today's technology, you can spin up a community website for, e.g. your family or your organization for the price of basic Web hosting and have all the perks of connecting without the downsides of e.g. your data being harvested and reviewed by anyone at CompanyX.

                  Sure, you have to do your own security, but the big social networks aren't impervious either. And you gain the advantage of not having your account randomly disabled or spamfiltered or shadowbanned.

                  It won't protect you from NSA or FBI, but I don't think most people care about that. On the other hand, people I've spoken with are aware and do care about snooping by CompanyX employees.

                  The more I think about it, even as writing this comment, the more I can see that we are very close to rapid disruption in the social network space.

                  • skinnymuch 1124 days ago
                    Perhaps it is my family being minority immigrants with other caveats. Snooping by employees is rarely done. It most likely isn't going to mess you up too badly either.

                    On the other hand, the number of people I know who are either deathly afraid or just normally afraid of the government and its agencies like the ones you mentioned is high. This is definitely going to be a minority of people. My family and extended family aren't common cases.

                    I don't see either as big deals though. Not enough to have people not stick with bigger apps that have network appeal.

                    I don't see any disruption happening, just more of the same in different clothing. I'm a pessimist, though.

      • bogwog 1125 days ago
        In my case, I don't even remember giving them permission to use my contacts, yet I got accepted because one of my contacts sent me an invite.

        I might have given them permission without realizing it, but what could've also happened is that they saw my phone number in someone else's contact list, and assumed we were contacts.

        • evanmoran 1125 days ago
          You probably didn’t share, as I didn’t. I believe the contacts permission is only required if you want to share an invite, not to accept one.
    • _jal 1125 days ago
      Clubhouse can bite me.

      I refuse to use tooling from shitbags who try to extort me into compromising others' privacy for shiny toys.

      I know other shops do it, as if that makes it OK.

      • Aerroon 1125 days ago
        I remember signing up for Facebook back in the day. They tried to get me to share something about my email contacts list. That just made me not use Facebook instead. Unfortunately, everyone else didn't seem to have a problem with it.
        • 0x0 1125 days ago
          Facebook literally had a box on their web site asking for your email address and email account password, so they could log in to your webmail and scrape your contacts.
        • bonoboTP 1125 days ago
          Normal people value their social standing and their relationships, bragging rights etc. higher than abstract principles. It's only loners who will resist. Popular people will be on board because they manage their brand and image instinctively. Wannabe popular too.
      • f430 1125 days ago
        Server is in the People's Republic of China to boot. But I know we have many wumaos and apologists here on HN because they tasted blood money.
    • antipaul 1125 days ago
      App Store guidelines forbid using the Contacts for anything except the intended purpose: https://appleinsider.com/articles/18/06/12/apple-disallows-d...

      Do we give CH the benefit of the doubt =p ?

      In any case, I also hope (and expect) Apple to implement better controls for sharing contacts.

      EDIT: Typo

      • koboll 1125 days ago
        Huh, so Clubhouse is explicitly breaking Apple's rules.

        Surely Apple knows this, but is allowing it because... it's mega-popular?

        What's the point of having rules if clawing your way to popularity by leveraging their violation is deemed permissible?

        • jtsiskin 1125 days ago
          How are they breaking the rules? It seems like they are using it for the same purpose they prompt permissions for.
          • csommers 1125 days ago
            They create ghost profiles for those contacts, just in case that user ever signs up. That’s fucking garbage; they should be ashamed of doing it, and the app should be immediately removed from the App Store.
            • pizza 1125 days ago
              This seems like the shady type of thing lawmakers should pass laws against
              • lupire 1125 days ago
                It's the sort of thing Apple should ban to protect its claim that it is more private than Android.
                • RapidFire 1125 days ago
                  It really is.

                  I keep buying into the Apple ecosystem because of their stance on user privacy. Sure, they aren't perfect, but they are miles ahead of their competitors.

              • bredren 1125 days ago
                I see this as a key problem of our times. Social convention used to have a stronger impact on behavior. Now it isn't enough for behavior to be disdained, it must be flagrantly illegal.
                • pizza 1125 days ago
                  Growth by any means necessary... it seems like there are tens of thousands of apps that each act like their own data bureau, compiling dossiers on billions of people, just because it makes money. Maybe a few percentage points' value lost as a slap on the wrist every now and then. I feel that, in this scenario, rather than a better carrot, we need a better stick.
                • lupire 1125 days ago
                  When was the time when social convention had a stronger impact?
                  • withinboredom 1125 days ago
                    Social convention has always had a strong impact. Where I live, people will cut you in line if you leave a space big enough for them to fit and it's perfectly ok. Where I'm from, if someone did that, you'd end up with an angry mob and probably a fist in your face.

                    Social conventions are always stronger than the law; at least in person.

            • stjohnswarts 1120 days ago
              Really? Goddamn that's some dirty tactics. Guess I won't even look up what they actually do. FB already got its hooks in me when I was young and stupid, never again.
    • JMTQp8lwXL 1125 days ago
      It's disingenuous of them to say they "have to" do contact upload. Why can't I type in a phone number to invite? Completely hostile. Consequently, I have invited nobody.
      • vinay_ys 1125 days ago
        Same here. It also seems to burn through battery more quickly than other apps.
        • 177tcca 1125 days ago
          An app that recreates party lines on POTS burning through battery is unfortunately unsurprising!
    • sneak 1125 days ago
      When you leak your contacts, you harm others, not just yourself.

      This, among other reasons, is why I never give out the number of my SIM card, or my residential address, etc., to anyone. They're just going to click "allow" and give it to a thousand shady companies, starting with Facebook.

      I never give people data I don't want stored in my shadow profile.

    • satya71 1125 days ago
      Here's how to get around Clubhouse uploading contacts. We shouldn't have to do this, but here we are.

      1. Disable contacts for all your configured accounts.
      2. Add a dummy Gmail account and enable contacts for it.
      3. Add the invitee to the dummy account.
      4. Give Clubhouse access to contacts.
      5. Send the invite.
      6. Remove Clubhouse's contacts access.
      7. Re-enable the contacts you disabled in step 1.

      • lupire 1125 days ago
        0. Don't use Clubhouse because it adds no value?
        • satya71 1125 days ago
          When you run a business, you have to go where the people are. If my customers are there, I have to be there.
          • jcims 1125 days ago
            I’d think that depends on the business. What is the engagement like on clubhouse? Do you participate or just have a presence?
          • bredren 1125 days ago
            This is the only reason I have to touch Facebook. It’s icky.
    • gherkinnn 1125 days ago
      I did the same and I’m still annoyed at myself.

      Clubhouse is pretty shit, really. So I sold my soul and got nothing in return.

      • bredren 1125 days ago
        Thanks for sharing this.

        I have similar feelings about the product, but am curious to hear your reasons in detail first if you'll share them.

        • gherkinnn 1125 days ago
          The one thing that got me interested is them using a photo as the app icon. Intriguing. Maybe there's some fun to be had. The rest was of no real interest to me. Silly, but here we are.

          Trivialities aside, the content is not for me. It's either some self-help thing or a get rich fast scheme. And I don't care about either.

          Worse though is the content delivery. They talk so much and say so little. Horrible.

          It really is this:

          > Clubhouse is C tier people listening to B tier people talk about A tier people

          And here I am, a D tier person not wanting to be part of this circlejerk.

          • bredren 1124 days ago
            Thanks for that feedback.

            I noticed the app icon as well. I wondered who it was, but didn’t look into it.

            I’m in agreement about the content.

            The “entrepreneurship” culture on Clubhouse seems to be either VC / media worship or this hustle thing.

            I joined a "Real Estate Money" group that was pitching $500 investments in shared AirBnB properties. It had calling cards of a scammy “investment” group. A mix of cheerleading and leadership-enriching sales to suckers.

            I spent some time last night in some social groups, and while entertaining from a sort of shock-value perspective, they were not well moderated. They would not work with bigger audiences.

            Andrew Sorkin interviewed Bill Gates on Clubhouse on Friday. My eyes widened when I saw that, and I questioned my initial assessment of the product and its velocity. Then I realized I was thinking of Aaron Sorkin. Andrew is a mainstream media journalist. I doubt Gates gave a hoot about the medium.

            I think VC, media and “influencers” all slept on TikTok and are now overcompensating with involvement on Clubhouse. It’s better suited to them anyway, and it matches the atmosphere of conferences.

            It is the type of activity PG labeled “playing house” and is perhaps well suited as a stand in for puffery, fakery and “influence.”

            Regarding content delivery, it is like you read my mind. I’d describe Clubhouse content, even the purportedly serious stuff, as awash in loquacious word salad.

            Clubhouse content forces listeners to be beholden to linear progression. So you lose the most valuable things about audio content found in podcasts: editing, and people who have developed the skills to be engaging.

            Clubhouse lacks the crucial “trick play” of podcasts, where you can option out of minutes of content (and ads) at any time.

            So anyhow, I regret sharing the address book with them and took my lumps ITT for admitting as much.

            However, I don’t regret trying it. Critically evaluating nascent platforms and technology is what I do.

            • ayewo 1124 days ago
              > Andrew Sorkin interviewed Bill Gates on Clubhouse on Friday. My eyes widened when I saw that, I questioned my initial assessment of the product and it’s velocity. Then I realized I was thinking of Aaron Sorkin. Andrew is some mainstream media journalist. I doubt Gates gave a hoot about the medium.

              Funny, when I read that, my eyes equally widened, until your next sentence corrected the interviewer’s name in my head.

              I believe Andrew runs DealBook at the NY Times, but I’m curious how the interview was conducted.

              Bill Gates is on record from some years ago saying “no iPhone for me” when asked if he used an iPhone. This was around the time Windows Phone lost the mobile market to iOS and Android, and since Clubhouse is iOS-only, I’m wondering how Gates was able to take part, unless he’s changed his mind since he was asked that question.

    • woadwarrior01 1125 days ago
      I have an old iPhone with an empty address book for testing dodgy apps that require contacts access; I use that for sending Clubhouse invites. OTOH, Clubhouse seems to work fine on my primary phone, where I haven't given it contacts access.
      • Haemm0r 1125 days ago
        For Android, I can recommend "Shelter"[1], which lets you set up a work profile so you don't have to share your contacts, files, etc. Downside: if you already have a work profile, it does not work (Android allows only one work profile).

        [1] https://f-droid.org/en/packages/net.typeblog.shelter/

      • lupire 1125 days ago
        If your invitees don't also have a spare iPhone, what's the point of inviting them? They'll have the same problem with no workaround?
        • woadwarrior01 1125 days ago
          You don't need to grant the Clubhouse app access to your contacts to use it. ATM, that's only needed to invite people.
    • styfle 1125 days ago
      I was kinda confused at first to see the top suggestions were all Doctor’s Offices. Then I figured it out.

      https://twitter.com/styfle/status/1358186671007760385

    • dehrmann 1125 days ago
      First I have to keep a burner number with a real sim card for things that require signup, now I have to keep a burner phone with no contacts?
    • paul7986 1125 days ago
      Would never sign up for or use a service that has such an invasive requirement. I only use my Google Voice number for any type of public transaction, even dating. Spam and robocall it all you want; surprisingly, I've never received many such calls.
    • stjohnswarts 1120 days ago
      It's not really understandable. It should be opt-in, with an option like: "I would like these 10 people who are important to me to be on the list you look at, not these other 400 people whose phone numbers I've collected at some point."
    • lucb1e 1125 days ago
      They did something bad and yet here we are. I don't know what Clubhouse is, but I'm somewhat tempted to look it up. Marketing: successful. (I won't, in an attempt to counter that effect of growing due to negative publicity, but I find it noteworthy how well it works.)
    • the-dude 1125 days ago
      What about dividing your contacts into circles and only give permission to a specific set?
      • mandelbrotwurst 1125 days ago
        Sure, as long as it’s possible to create a circle containing only one contact, the way giving permission to access photos now works on iOS.
      • skeletonjelly 1124 days ago
        Google+ anyone?
    • crossroadsguy 1124 days ago
      Or a feature implementation which would essentially mean "select fake/random data instead".

      Fake/mock GPS (without telling the app it's fake, unlike what Android does), fake contacts, etc.

    • MikeGale 1125 days ago
      Should you not tell your contacts that you gave their details to Clubhouse?
    • TedDoesntTalk 1125 days ago
      I mean, unless you're a newbie to the internet, how is this possible?
    • JumpCrisscross 1125 days ago
      > Clubhouse requires contact list in order to get invites, which are required to sign up right now

      How is this GDPR compliant?

      • vmception 1125 days ago
        I think this is a wording issue if you haven't used Clubhouse.

        You don't need to share contacts in order to get invited, like you don't have to do it to use the platform. You have to do it to invite others (like your friend that you told about Clubhouse) after you are already on the platform, so that is not regulated by GDPR.

        It is a shitty user experience and I also want Apple to control this at the OS level. Let me select which contacts if I want to do it at all.

      • corty 1125 days ago
        > How is this GDPR compliant?

        It isn't, really, but the question whom to prosecute is complicated. Clubhouse gets the contact list data from you, the user. Usually, somewhere in the ToS, there is a little thing where you confirm to have the right to share all the data you share with Clubhouse. That means that first and foremost, you as a user are responsible.

        If you are a non-commercial user using Clubhouse from your private phone, what you do with your private contacts isn't covered by GDPR, private stuff is an exception. However, as consumer, European legislation protects you from surprising and unusual terms, which this might be. Legislation might also protect all your contacts. However, this is a question that still needs to be litigated in court, and I don't remember any decisions around that problem (WhatsApp basically has the same constellation).

        If you are a commercial user, because this is your work phone and your contacts are colleagues, business partners, customers, things are quite different. You are, as a data processor, responsible for how you pass on your contact list. You better make sure that you are allowed to do that (because you have a GDPR-compliant reason like legal obligation, contractual obligation with your customer, assent or legitimate interest) and that your contacts have been informed about what you are doing beforehand. Also, you then need a written contract with Clubhouse about the data being passed along, about how it will be used and protected, etc. Also, passing along the contacts to Clubhouse must be necessary for a predetermined, well-defined reason that can be considered more important than your contacts' right to privacy.

        So as a private person, you might get away with using Clubhouse. As a company, employee, self-employed, state official, whatever, you are probably in hot water, because surely you didn't do all the required things. But for Clubhouse this might not be a problem, because as current case law stands (imho, iirc, ianal, ...) Clubhouse isn't the party that did something wrong there.

        • GekkePrutser 1125 days ago
          On Android if you use Work Profile your work contacts are in a separate partition and can only be accessed by approved company apps. This works really well for gdpr compliance with dual-use (company & mobile) devices.
      • pmontra 1125 days ago
        I see the point, but if I upload my contact list, is the non-compliance mine (I didn't ask permission from each of my contacts) or Clubhouse's (they asked me to do it)?
        • gnud 1125 days ago
          It should be glaringly obvious to Clubhouse that they don't have the right to even store most of this data, let alone use it for anything.

          So even if you are at fault, I can't imagine that would help them a lot, if some data protection authority looked into this.

        • avereveard 1125 days ago
          both, yours for sharing, clubhouse's for storing.
      • Nextgrid 1125 days ago
        It isn't, but in addition to the (valid) arguments the other commenters make about Clubhouse not having any assets in Europe (thus making enforcement of any kind of penalty nearly impossible), the majority of the data protection agencies are also completely incompetent at enforcing the GDPR even against companies that they can collect from.
      • msla 1125 days ago
        Because it's a non-EU company, and non-EU citizens didn't vote the GDPR into existence.

        Europe doesn't get to impose its law on other lands. Colonialism is over.

      • numpad0 1125 days ago
        Why would you want to be GDPR compliant?
        • drclau 1125 days ago
          Because in the European Union it is a regulation, and you (as a company) are fined if you are not compliant.

          I recommend having a look over the Wikipedia page on the subject:

          https://en.wikipedia.org/wiki/General_Data_Protection_Regula...

          • fiddlerwoaroof 1125 days ago
            If you’re not subject to the EU (i.e. don’t have any offices, servers, etc. in the EU), I don’t see how the GDPR is relevant: non-EU citizens generally aren’t subject to the laws of the EU.
            • malka 1125 days ago
              Then you cannot have EU customers, or make wire transfers through the EU.
              • Moru 1125 days ago
                You can also forget vacation trips in EU.
              • numpad0 1125 days ago
                If thoroughly enforced, which is currently not the case.
            • drclau 1125 days ago
              "The GDPR also applies to data controllers and processors outside of the European Economic Area (EEA) if they are engaged in the "offering of goods or services" (regardless of whether a payment is required) to data subjects within the EEA, or are monitoring the behaviour of data subjects within the EEA (Article 3(2)). The regulation applies regardless of where the processing takes place. This has been interpreted as intentionally giving GDPR extraterritorial jurisdiction for non-EU establishments if they are doing business with people located in the EU."

              Source: https://en.wikipedia.org/wiki/General_Data_Protection_Regula...

              • msla 1125 days ago
                Countries or groups of countries don't get to impose their law on other countries.

                That's called colonialism, and Europe is supposed to have given it up.

                • drclau 1125 days ago
                  I am not a lawyer, and I don't claim I understand the legal mechanisms involved. I don't even claim GDPR is perfect.

                  But, as I see it, EU is protecting its citizens. If you want to do business with EU citizens you must abide by EU regulations. It's that simple. I don't get how this came to be all of a sudden about colonialism. Any business is free to stay out of EU.

                  • cortesoft 1125 days ago
                    And any EU citizen is free to not do business with a company outside the EU.

                    Do you think the EU laws should apply to people selling things to EU citizens while they are on vacation in other parts of the world? If someone from Germany travels to Brazil and buys something from a store, are they required to abide by EU rules?

                    If someone from the EU leaves the EU digitally to buy something in another country, it isn't up to the seller to enforce EU rules.

                    Unless you have an entity (either yourself or your business) under EU jurisdiction, you don't have to follow their rules.

                    • drclau 1125 days ago
                      There's an asymmetry of information and power in the relationship between a business and a citizen. Governments, generally, attempt to mitigate this asymmetry. Hence, we have consumer protection laws, GDPR and the likes.

                      While these solutions may be incomplete, or imperfect, having none is definitely worse.

                      > If someone from the EU leaves the EU digitally to buy something in another country, it isn't up to the seller to enforce EU rules.

                      > Unless you have an entity (either yourself or your business) under EU jurisdiction, you don't have to follow their rules.

                      Please _do_ read the link I already posted in a previous comment [0]. It clarifies many things, but I don't want to paste too much content here.

                      [0]: https://en.wikipedia.org/wiki/General_Data_Protection_Regula...

                      • cortesoft 1125 days ago
                        I am not sure what you are trying to argue here. I am not making any moral claim about whether a GDPR-type regulation is good or bad. I am simply saying that the EU saying the law applies outside their borders doesn’t make it so.

                        If I am a US citizen living and working in the US, and break the GDPR by storing data illegally from visitors to my website from the EU, the EU can certainly try to fine me or issue a summons or whatever they want to do.

                        However, there exists no extradition treaty for this law, and there would be no way for the EU to enforce judgement.

                        • fiddlerwoaroof 1124 days ago
                          Yeah, this is the really frustrating thing about conversations about the GDPR: whatever you think about how companies should act, legislation doesn’t really matter unless there’s some way the government can take retributive action against those who ignore it. When someone asks about what this mechanism is, you inevitably get a whole host of people assuming you dislike the legislation.
                      • fiddlerwoaroof 1125 days ago
                        This article basically confirms my suspicion that this provision is basically unenforceable:

                        http://slawsonandslawson.com/article-32-the-hole-in-the-gdpr...

                  • msla 1125 days ago
                    > If you want to do business with EU citizens you must abide by EU regulations.

                    No, no more than if I want to do business with Saudis I'm liable for punishment if I drink a beer.

                • mellavora 1125 days ago
                  I wonder when the USA will follow suit?
            • ekianjo 1125 days ago
              If some of your users are in the EU you need to be GDPR compliant.
              • fiddlerwoaroof 1125 days ago
                This is what the law says, but I don’t understand how this is expected to work: without some kind of treaty from the US government, the EU has no way to make US companies comply.
                • sneak 1125 days ago
                  The US and EU have a treaty specifically about enforcing each other's laws. (More accurately, the nations that comprise the EU are individual signatories to such treaties.)
                • mattmanser 1125 days ago
                  Have you not heard of extradition treaties?

                  For example, that's what the US is using on Kim Dotcom.

                • anonymousab 1125 days ago
                  There's a slew of individual things that can be done. EU companies can be prevented from doing business with a (willfully) noncompliant company. Wire transfers going through the EU and other operations can be blocked. And, of course, the service itself, its apps, its sites, its traffic, can be blocked from accessing the EU internet (or being accessed from it).

                  That's not even getting into international pressure levers.

                  I don't know that we've seen any of those kinds of actions yet, but they're clearly on the table if a company breaking the rules became a real "problem".

                  The thing is, if you're just completely avoiding doing any business with the EU, having any EU customers or users, and just not touching the EU with a 1000 mile pole and avoiding the GDPR in such a fashion - well, then there's no reason to go after you. The legislation has done its job.

                  • philwelch 1125 days ago
                    > And, of course, the service itself, its apps, its sites, its traffic, can be blocked from accessing the EU internet (or being accessed from it).

                    In other words, the EU can attempt to extend its internet regulations over the rest of the world by implementing a China-style firewall. Well, we'll see if that happens.

            • alvarlagerlof 1125 days ago
              If you're operating a business that interacts with customers in the EU, GDPR applies.
        • bdcravens 1125 days ago
          To avoid substantial financial risk.
          • calvinmorrison 1125 days ago
            Has the EU sued and won against any company who is not located in the EU?
            • otterley 1125 days ago
              That's not a good test, because the law is still relatively new, and it takes a while for litigation to make its way through the system. We also don't necessarily know who has settled out of court.

              Would you like to be a test case for us?

              • calvinmorrison 1125 days ago
                I would love to be a test case for it, but I'm not in a position to be.

                I'd be extremely interested to see a company that doesn't operate in the EU brought to court, and to see which other countries, if any, are willing to help the EU enforce those judgments.

        • marban 1125 days ago
          https://www.jdsupra.com/legalnews/clubhouse-app-faces-court-...

          On a side note, Germans are obsessed with Clubhouse.

      • paulie_a 1125 days ago
        They are in California. They can give the finger to the GDPR. It's irrelevant to most people in the world.

        People tend to forget that it is not applicable. For instance, nothing I build will ever comply with it, regardless of users that might be in Europe.

        Clubhouse has no duty to obey European law.

        The question is: why do you think they need to be compliant?

        • GekkePrutser 1125 days ago
          This is not how it works. If you make it available to EU users, you have to comply with GDPR (at least when it comes to those users' data).

          For the same reason WhatsApp's new T&Cs don't really change anything for EU users.

          However, I don't think the collection of contacts is actually illegal under GDPR, considering WhatsApp does exactly this too. And it's huge in Europe, much bigger than in the US. If they haven't gone after WhatsApp for this, they probably won't go after Clubhouse.

          • paulie_a 1125 days ago
            If they don't do business there, they don't have to comply. Making it available doesn't count.

            Just like I don't have to comply if I have EU users on a service: I am in the United States, and Europe cannot enforce its laws here. It's just the same as if Saudi Arabia tried to enforce their laws here. They carry no weight.

            That is what makes the GDPR insignificant. It applies to Europe, not the rest of the world. The cookie warnings for the vast majority of the internet are stupid and unnecessary.

            So call it illegal in Europe, but who cares?

            It honestly is maddening how many people care about the GDPR who don't need to.

            • GekkePrutser 1125 days ago
              There's many EU things that take effect with vendors outside the EU. Like software sales: Try to buy a license for a software package from the EU (or with an EU payment card) and you will always be hit with VAT at the rate of your country :( Even if the company is US based only. With the exception of really small ones I guess. In the above case it's annoying for us :) But in the case of GDPR it's good IMO.

              Anyway the EU says it applies but I agree they don't really have much in the way of enforcement capability with companies that have no presence here. Though they could ask Apple/Google to remove it from the store I suppose.

              And of course most companies do have a presence here. All multinationals do, and even the smaller ones. Even if it's just a sales office.

              • paulie_a 1125 days ago
                Most American companies don't though. They can safely ignore european laws
                • TT3351 1125 days ago
                  And in most cases they also choose not to operate in the nations whose laws they are flouting. EDIT: a few weeks ago, EU posters here were describing how ERCOT was blocking access to the company's public-facing website, citing not wanting to comply with the GDPR.
    • xtat 1124 days ago
      OpsecLeaksHouse
    • hshshs2 1125 days ago
      please reconsider doing this next time if you’re able to
  • jpmattia 1125 days ago
    Not exactly on topic, but historical context maybe: Long ago (early 90s?) when it was guessed/assumed that intelligence agencies were scanning emails, emacs was still among the best ways to read and send email. So emacs provided a handy function to append a random list of "hot" words to each outgoing email in the signature, just to degrade the signal-to-noise of such surveillance.

    It's still there today, and you can see the output via M-x spook.

    • ianmcgowan 1125 days ago
      That used to be the case on usenet too - people would put attention-grabbing words in .signature as "NSA Food" - to overwhelm the NSA data capture algos. It seemed like a futile gesture even at the time, but particularly poignant looking back from a post-Snowden world.
      • eternalban 1125 days ago
        The real poignancy is the shift in hacker political views. Call it post-software-is-sexy world. Those usenet sigs were by hackers who lived in a world where software engineer or programmer were social reject code words. That world changed after geeks came into money. Suddenly but soon thereafter, paranoia about privacy was rewarded by tinfoil hats. (And then yes, years later, came along this guy called Snowden.)
    • shervinafshar 1125 days ago
      Such an interesting context. Thanks for sharing this. I appreciate the nostalgia poetics of this today.
  • cyberlab 1125 days ago
    Remember: some apps check for what apps are installed on the device, and if they see this installed they can deduce you're poisoning the well.

    Also if you want to research obfuscation and how it thwarts surveillance, check these:

    https://www.schneier.com/blog/archives/2019/11/obfuscation_a...

    https://www.science20.com/news_articles/obfuscation_how_to_h...

    https://www.theguardian.com/technology/2015/oct/24/obfuscati...

    https://adnauseam.io/

    https://bengrosser.com/projects/go-rando/

    • djrogers 1125 days ago
      >> some apps check for what apps are installed on the device

      I can't believe that's allowed by the OS - seems like a horrible policy.

      • foobar33333 1125 days ago
        Probably should be removed but I have seen it used legitimately sometimes. Some apps for things like contact syncing will tell you there are other apps for caldav and stuff and check if you already have them installed to not show the message.
        • Spivak 1124 days ago
          Nextcloud and DAVx5 by chance?
      • TedDoesntTalk 1125 days ago
        agreed. Id like to see a source or reference for this.
    • artwork159 1125 days ago
      If they saw this app installed, what might they actually do about me or my contact list?
      • cyberlab 1125 days ago
        They could just flag you as someone who poisoned the well and ignore you I suppose. Remember: bad actors go after low hanging fruit and tend to ignore privacy-aware folk and those doing anti-surveillance.
      • sopromo 1125 days ago
        Remove all contacts whose first name and last name start with Z.

        The docs say they prefix every first and last name with Z, so that would be a start.
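        That filter is trivially easy to write; a minimal sketch (the contact structure and sample data are hypothetical, only the Z-prefix convention comes from the docs):

```python
# Sketch: drop poisoned contacts that follow the documented convention
# of prefixing both first and last name with "Z".
# The contact structure and sample data here are hypothetical.
contacts = [
    {"first": "Zanna", "last": "Zsmith", "phone": "+21 5550 0111"},
    {"first": "Bob", "last": "Jones", "phone": "+1 555 123 4567"},
]

def looks_poisoned(contact: dict) -> bool:
    """True if both names carry the 'Z' prefix the tool is said to use."""
    return contact["first"].startswith("Z") and contact["last"].startswith("Z")

clean = [c for c in contacts if not looks_poisoned(c)]
```

        Which is exactly why a fixed, documented prefix makes for weak poisoning: one line of filtering undoes it.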

        • cyberlab 1125 days ago
          Also: check for contacts with weird country-code prefixes that don't match the country the user is based in
      • speedgoose 1125 days ago
        I guess they may decide to not sell your data. Which is actually a good thing.
  • ccleve 1125 days ago
    This is a common technique in the mailing list industry. It's called "salting". You add fake names, but real email addresses, street addresses, or post office boxes. You then monitor what shows up in these places addressed to "Mr. Fake Name". It's how mailing list companies monitor who is using their lists and helps control misuse.
    • sleavey 1125 days ago
      A general term for this is a "copyright trap" [1]. Map makers for example often add small, fake features to be able to tell if another map was copied from theirs.

      [1] https://en.wikipedia.org/wiki/Fictitious_entry

    • bredren 1125 days ago
      Have you worked in this industry? Curious about more details of tricks from various list makers/sellers.
      • ccleve 1124 days ago
        I have bought lists. I'm working on a system to manage lists.
  • washadjeffmad 1125 days ago
    I seem to remember CyanogenMod having a per-app sandbox feature around 2013 that returned blank info from a virtual root.

    Like many point out, this isn't data poisoning, especially if there aren't metric-breaking honeypots around the web seeding these services with enough noise to make these collection practices useless, which there are not.

    A more effective alternative might be hashing real contacts to generate seeds of complete but false profile information. Apps thinking they got the mother lode wouldn't be able to assign confidence to any results they didn't have duplicates of, and slowly over time, groups who used this would become worthless.
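    One way to sketch that hashing idea (the name pools, number format, and seeding scheme are all invented for illustration):

```python
import hashlib
import random

# Sketch: derive a deterministic fake contact from a real phone number.
# Hashing means the same address book always yields the same decoys,
# so they look like stable, plausible contacts to a collector.
# Name pools and number format are invented for illustration.
FIRST_NAMES = ["Alex", "Sam", "Jordan", "Casey", "Robin"]
LAST_NAMES = ["Miller", "Chen", "Okafor", "Novak", "Silva"]

def fake_contact(real_number: str) -> dict:
    # Stable per-contact seed from a hash of the real number.
    digest = hashlib.sha256(real_number.encode()).digest()
    rng = random.Random(int.from_bytes(digest[:8], "big"))
    return {
        "first": rng.choice(FIRST_NAMES),
        "last": rng.choice(LAST_NAMES),
        "phone": "+1555" + "".join(str(rng.randrange(10)) for _ in range(7)),
    }
```

    Run it twice on the same real contact and you get the same decoy; different inputs scatter into different but internally consistent fakes, which is what would erode the collector's confidence over time.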

    • sleavey 1125 days ago
      I remember that too; it was great. That feature disappeared at some point though; it's not in LineageOS these days as far as I've found. I recall it made some apps crash, but as far as I could tell, only those that weren't robust enough to handle being fed junk data. I'm not sure why that feature disappeared.

      EDIT: my guess is that a later Android update broke the existing Cyanogenmod code and no one was maintaining it.

      • dmt0 1124 days ago
        There's the XPrivacy framework that runs on top of Magisk or XPosed (not sure how it works now). I remember it allowed you to give very fine-grained permissions to apps and poison the data as well, with fake contacts, location, etc.

        Back in the day it required a lot of tinkering to set it up, and would likely make your OS pretty unstable.

  • Waterluvian 1125 days ago
    Apps using contacts is a $#%$ing anxiety attack for me. The scum companies don't care. They just want more leads. But for me, it's this fear that they're going to spam my exes and old roommates and bosses and professors and landlords and everyone who ends up added to my contacts.

    Signal did that to me last week. This person I'm not on speaking terms with got Signal and it added us and announced to each other we were on it and put our empty conversation onto my list of convos.

    Phone contact lists are a complete $&^*ing disaster and Apple needs to make it far more clear what specific contacts I share access to.

    • carmen_sandiego 1125 days ago
      Not to be unkind but I suppose most people are not really traumatised by merely seeing someone's name, even if they're not on speaking terms with that person. It probably falls on the side of convenience for the vast majority. For the Signal org, it's possibly even an existential issue, since it helps them counter network effects in the incumbents. It's hard to expect them not to do it, then.

      Having said that, I think it would be nice for Apple to implement what you describe.

      • aboringusername 1125 days ago
        > but I suppose most people are not really traumatised by merely seeing someone's name

        I mean there are cases where that can be devastating.

        "Ohai here's your old abusive ex, here's a chat box just for good measure, good luck!".

        There are people who I'd never ever want to be within a textbox and a tap away from accessing me, for any reason, period.

        You can get restraining orders in the physical world; the digital world, however, has no boundaries when the apps themselves are too stupid, defined by code with no grasp of real-world logic. I wouldn't expect an app to understand a 'court order', since that's a human construct. How do we design against that in the digital space, when you are so accessible that, if someone crazy is following you, you're basically forced to retreat because there are no effective measures or guards against it?

        • carmen_sandiego 1125 days ago
          Well, a couple of things:

          (a) You can't take seeing their name, but you keep them in your contacts? Don't you occasionally scroll past it with a call button right there, which is just as easy to hit and put you in touch with them? How is this any different? Seems a bit silly.

          (b) As far as I know, research suggests hyper-avoidance is not a good way to resolve trauma. So I'm not convinced by the idea that this is harmful, especially when you can control it through (a).

          • Waterluvian 1125 days ago
            A contact list often operates as a database of what number belongs to who, for guarding incoming calls. It can be a security tool.
            • nvr219 1125 days ago
              In iOS and Android, incoming call blocks are in a separate database and explicitly not the contacts database.
            • carmen_sandiego 1125 days ago
              You can generally block calls by number, without having them as a named contact.
              • lucb1e 1125 days ago
                I do see Waterluvian's point though. You might still have business with them yet you don't really want to deal with them otherwise. Knowing who this SMS or call was from can be helpful rather than blocking the number outright.

                Then again, seeing their name when installing Signal and figuring "oh hey they have signal too" seems no less weird to me than seeing their name in my phone book and thinking "oh hey they have a phone too". If that really sets you off... that seems unlikely. So I don't really get this subthread, even if I see the general point that you might not want to be reminded of certain people on a regular basis (for me, installing a phone number-based social application is not a monthly occurrence).

          • heavyset_go 1125 days ago
            > You can't take seeing their name, but you keep them in your contacts?

            If I start getting abusive calls or texts from a usual suspect, I want to know who it is. My carrier-level number blocking resets every couple of years, and I cannot remember everyone's phone numbers.

          • the_local_host 1125 days ago
            Even if you don't keep them in your contacts, the connection tracking can be problematic if they keep you in their contacts.

            "But what if you didn’t give Clubhouse access to your contacts, specifically because you didn’t want all or any of them to know you were there? I regret to inform you that Clubhouse has made it possible for them to know anyway, encourages them to follow you, and there isn’t much you can do about it... I got followers who weren’t in my contacts at all — but I was in theirs."

            https://www.vox.com/recode/22278601/clubhouse-invite-privacy...

          • musingsole 1125 days ago
            Why do you have the authority to dismiss many's experience of a feature? Because you can think of a way you would handle it and you've read some things?
            • carmen_sandiego 1125 days ago
              Because we're all here talking about how things should be designed, which often inherently requires fulfilling some needs at the expense of others? Not quite sure how you expect those decisions to be made without people gathering to discuss the relative merits of each approach.

              If you're about to tell me we should just implement every user request that they claim is of 10/10 importance to them personally, then I'm not even sure what to tell you. Have you taken all of a few seconds to consider what happens when two people make conflicting requests? Then we're back to evaluating things and discussing them again. How arrogant of us.

              I appreciate the implied authority you've given yourself to be the conversation police, though.

      • myself248 1125 days ago
        In my case it wasn't traumatic, exactly. More, targeting.

        There was an individual that I kept in my contacts, you see, for the sole purpose that if he ever called me, I'd know to let it go to voicemail. We had been close long ago, but he stopped living in consensus reality and wasn't interested in treatment. I considered him disturbing but not immediately dangerous, just someone I didn't want to reconnect with.

        When I installed Signal, he got the notification that I had done so, and immediately messaged me, along the lines of "Oh hey, you still exist! And I guess by the timing of this install, you must be at [security-focused event] this weekend, yeah? Hey let me tell you about my latest harebrained scheme..."

        I understand that Signal needs to do that sort of connection to work behind the scenes, but they don't need to generate an alert on the guy's lock screen about me.

      • nathanfig 1125 days ago
        "Did this cause trauma" is not the bar we're trying to set here, any level of anxiety caused by tech companies misusing contacts is bad.
      • heavyset_go 1125 days ago
        > Not to be unkind but I suppose most people are not really traumatised by merely seeing someone's name, even if they're not on speaking terms with that person.

        Domestic abuse, harassment/sexual harassment, stalking etc are all more common than they should be.

      • ficklepickle 1125 days ago
        I've got a dead friend that I'm reminded about every time I open signal. "DeceasedFriend is on signal!". No, no he is not.

        I'm sure I could clear it, but I don't really want to yet.

        On the whole, I still like the feature.

        • carmen_sandiego 1125 days ago
          I'm sorry about your friend. I've had similar experiences with tech products, but I tend to think that unexpected reminders (of any kind) are all part of the process of dealing with loss. That hyper-avoidance seems an unhealthy route, popular though it is in modern discussions about emotionally difficult subjects.
      • Waterluvian 1125 days ago
        Yep. I can't claim to know how everyone else responds to these things.

        The Signal example isn't the worst. It's a mutual connection. It's not like they're emailing hundreds of people saying "Waterluvian wants you to get on signal!"

        What's to stop them from doing that when they get sufficiently desperate? I don't even own my contact lists. They seem to grow on their own with anyone I've ever emailed.

        • sneak 1125 days ago
          Signal does it for anyone in your address book, not just mutuals.

          Your "anyone I've emailed" example is a great reason not to use the same service you use to host your email to host your contacts.

          Personally I would never in a million years sync my contacts to Google, which I assume is what you mean here (most people use gmail).

          • Waterluvian 1125 days ago
            Probably. Contacts have been confusing. I've had Gmail list. My phone. What's in my Sim card. My Sony contact list...

            I had a really infuriating time trying to clean them all up many years ago and I've just tapped out.

            • ficklepickle 1125 days ago
              Same here. I recently went to LineageOS and use fastmail for email/contacts/calendar. It's been wonderful.
      • laurent92 1125 days ago
        The problem I have with Whatsapp is even more than Signal: Not only they engage me to start a conversation with that customer to whom I only wanted to appear super-stern and rigorous, but they also send them my profile photo and my name!

        My business name is not my private name! At least let me remain under my name in their address book, don’t give them information.

      • crossroadsguy 1124 days ago
        Signal shows contacts (and just bare phone numbers as well) inside the app which have not been in my contact list for years (but once were).

        And this is how Signal suggests doing it https://support.signal.org/hc/en-us/articles/360007319011#io...:

        > Remove someone from your Signal contact list

        > Contacts must be blocked in order to be removed from your Signal Contact List. To learn how to block someone, click here.

      • foobar33333 1125 days ago
        I wish telegram had a setting for "Block everyone in my contacts list" Unfortunately it only seems to have the reverse
    • tchalla 1125 days ago
      Does Signal share contacts the same way others like WhatsApp does?

      https://signal.org/blog/private-contact-discovery/

      > Signal clients will be able to efficiently and scalably determine whether the contacts in their address book are Signal users without revealing the contacts in their address book to the Signal service.

      • lucb1e 1125 days ago
        Note that this SGX thing is broken seven ways from Sunday, but in principle, yes, they have some security measures here. We just have to trust them not to crack their own SGX environment, as well as trust Intel (regardless of SGX's security) not to generate an identical MRENCLAVE for anyone else with additional logging code running inside.

        This is the best system I know of anyone running, by the way. Threema, Wire, etc., nobody else has this (but then neither requires a phone number, so...). I also don't know of a better way to do phone number matching than having a trusted third party that bakes their private key into chips and verifies that you're really talking to the code you think you're talking to. The upsides of DRM technology!

    • purpmint008 1125 days ago
      About that Signal thing: Did that other person actually get a conversation starter message of some sort?
  • rasse 1125 days ago
    This makes me wonder if anyone has set up canary emails or phone numbers in their phone contacts.
    • rsync 1125 days ago
      "This makes me wonder if anyone has set up canary emails or phone numbers in their phone contacts."

      We (rsync.net) have a handful of dummy/fake users in our database whose emails we monitor. The email addresses are cryptic and random and use a different domain, etc.

      We should never see an email sent to one of these "canary" email addresses and, so far, we have not.

      I am also aware that many of our customers sign up with service-specific email addresses, using the '+' character ... something like john+rsync@mydomain.com.

      I personally have a rich and well developed pseudonym that I use for all online non-governmental transactions but in some rare cases I need to use my actual name and email - and in those cases I create '+' aliases.
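      The '+' convention is simple enough to sketch in a few lines (the addresses here are examples only):

```python
# Sketch: create a per-service '+' alias and later recognise which
# service leaked or sold an incoming address. Addresses are examples.
def make_alias(user: str, domain: str, service: str) -> str:
    return f"{user}+{service}@{domain}"

def service_of(address: str):
    """Return the '+' tag of an aliased address, or None if there is none."""
    local = address.split("@", 1)[0]
    return local.split("+", 1)[1] if "+" in local else None
```

      If mail for an address like john+rsync@mydomain.com ever arrives from some unrelated sender, you know which party shared the address.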

      • techsupporter 1125 days ago
        I've noticed a bunch of spammers starting to strip out anything after the + and before the @. This is why I've long used a catch-all e-mail domain (subdomain.example.net) where I can put anything I want to the left of the @ sign and no one is the wiser for my real e-mail address.
        • Answerawake 1125 days ago
          Is there some service where I can easily create unlimited custom email addresses for a flat monthly fee? I want to use a unique email for each new website/service. That would go a long way to solving some data leak/privacy problems. The problem with custom domain is I have to maintain it right? I want a service which I don't have to maintain. I used to use new Yahoo accounts but they are a hassle and recently they disabled free auto-forwarding.
          • osamagirl69 1125 days ago
            You can do this with any email provider that supports a catchall. I personally use fastmail and have been very happy with it. You don't need to 'create' the accounts, you just set it up so that *@yourdomain.com goes to your catchall. When signing up for a new service, you pick a unique/random email. Then you know unambiguously where each email in your inbox came from.

            I personally use the website as the email (example, if HN required an email it would be hn@mydomain.com) to make it easier to filter. But this can be gamed/guessed, to be more secure it is better to generate an actual random email for each site and store it in your password manager.

          • rzzzt 1125 days ago
            Mozilla has such a service: https://relay.firefox.com/

            I also remember seeing one Show HN recently that offered similar functionality, but couldn't find it via search. The problem is that if the e-mail alias provider becomes popular enough, their subdomains are soon disqualified from being used when registering to sites.

          • notfed 1125 days ago
            Protonmail allows wildcard emails from 1 custom domain if you pay for the ~$5/mo plan. No maintaining a mail server, just point your MX records to their servers.
          • wuuza 1125 days ago
            spamgourmet.com

            I have been using this since 2002. You don't even have to set anything up - just make up addresses on the fly. It's pretty awesome.

          • Nextgrid 1124 days ago
            I believe Fastmail supports that.
    • praptak 1125 days ago
      What do you mean by "canary" in this context? How do you detect that the canary is dead?

      I assume that the "canary being dead" ~= "an adversary added the contact to their watch list". But I don't think you can detect that.

      The best you could do is to add a random physical address hoping that you can detect physical surveillance (which is probably not realistic anyway).

      • rzzzt 1125 days ago
        It is like signing up with an e-mail +suffix for services, or the non-existent streets on digital maps; if you come across your fake contact elsewhere, you know that information has been shared.
        • vmception 1125 days ago
          it is trivial to strip suffixes off of aliased email addresses
          • kogir 1125 days ago
            If you control your own email routing, by using your own mail server, Google Workspace, Microsoft 365, etc, you can choose whatever convention you want.

            How would you know to strip everything after my first name?

            • vmception 1125 days ago
              I wouldn't care about the people using their own mail server

              I would just strip everything after a + sign
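              For what it's worth, the stripping really is a one-liner; a sketch of what a broker might run:

```python
import re

# Sketch: normalise a '+'-aliased address by dropping the tag.
# Only handles the common user+tag@domain convention.
def strip_plus_alias(address: str) -> str:
    return re.sub(r"\+[^@]*(?=@)", "", address, count=1)
```

              Which is why a catch-all domain with unrelated local parts is more robust against this than '+' tags.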

              • kogir 1121 days ago
                Can't strip it if it's not there. Everything after my name is the comment, and you have no way to know.
          • rsync 1125 days ago
            "it is trivial to strip suffixes off of aliased email addresses ..."

            This actually is not a bad point to make ... it would, in fact, be simple to strip +aliases but ... economically I don't think it makes any sense.

            You'd have to have a high level decision maker dictating an engineering fix in order to increase email authenticity by ... .01% ?

            ... and that assumes that the "engineers" down the chain understand how '+' works in email to begin with and have somehow communicated that back up to management.

            • vmception 1125 days ago
              My response here is that I think this discussion is naive, as the data brokers themselves already do it.

              So who cares about what some engineer at a random new business thinks.

              Aliasing isn't new. So this isn't a cat and mouse game that just got started.

          • rzzzt 1125 days ago
            What is the equivalent to that in the fake phone contacts domain? I guess removing people with the +21 country code would work for this particular approach, but otherwise...?
            • vmception 1125 days ago
              Good question hmm, I think its just a different strategy with phone contacts

              A data broker primarily wants the social graph to make a user profile with a phone number, to show ads later on. Those people won't typically be texting or calling with spam and ads; they'll just match the number and contacts up with information shared in other apps so that ads in your normal internet browser or ad-supported app use are more targeted.

              So if an erroneous contact never logs in, that's of no consequence to them; searching to exclude numbers would be less interesting and less likely than with just sanitizing emails.

        • rasse 1125 days ago
          Exactly!
      • rasse 1125 days ago
        Detection would require a call/sms/email. The idea would be just to detect, if your leaked data has been acted upon.
    • KirillPanov 1125 days ago
      The robocall epidemic has pretty much made the notion of "canary phone numbers" useless.
    • arminiusreturns 1125 days ago
      I create a new email for most services I use (I run my own email), but I hadn't thought of this! Thanks for the idea.
  • geniium 1124 days ago
    It's pretty sad that we've gotten to this point: creating fake contacts in our phones for "data poisoning".

    Where the hell are we going?

    • anotherfish 1124 days ago
      Going by the canary email addresses I put into my devices from time to time... nowhere good. Those email addresses receive spam despite never sending anything or being signed up to anything. Apps are actively uploading and selling email addresses. I'd not be surprised if some Big Data company has a massive graph of mobile numbers / email addresses sourced purely from app uploads, let alone regular signups. Then it's all correlated with other sources like LinkedIn. Yay! Profit!
  • augstein 1124 days ago
    How far have we come to even consider poisoning the data on our own devices, because we know it will be harvested and resold by 3rd parties.
  • vmception 1125 days ago
    To everyone talking about Clubhouse: this code is only for Android, and there is no Android version of Clubhouse, so it won't help you there.
    • throwawei369 1125 days ago
      TIL The vast userbase of HN is 95% Apple, 3% Android, 0.0005% Pinephone. The remaining ~1% don't even make a digital footprint since they use the old Nokia 3310 type phones.
  • annoyingnoob 1125 days ago
    I'm of the opinion that personal data is not like a currency and should not be seen as a form of currency.

    If you want to barter then I want to negotiate, no one sided contracts. Can't make a deal? Your loss then.

  • floatingatoll 1125 days ago
    Is it possible to create a network of contacts that triggers worst-case memory and cpu scenarios when the network is reconstructed from contacts?

    Or, put another way, can a collection of people doing this construct a set of synthetic contacts spread out in various ways across their devices, such that anyone doing contact analysis sees their analyses slow down, drain resources, or crash altogether due to network structure?

    • alcover 1125 days ago
      Wouldn't any worthy graph explorer handle cycles?
      • floatingatoll 1125 days ago
        If I had a nickel for every time an algorithm was found to have an exploitable weakness due to unforeseen alignments of input, I’d certainly have some nickels. We know what the common screwups in crypto are, and we could know what common screwups in network graphs are. I’m just wondering if anyone actually does know of any of those.
  • neilv 1125 days ago
    > The app is designed to be very simple and fail silently. If you deny permission to access contacts, the app will not complain, it just will not work.

    I don't understand the reason behind "designed to...fail silently" in this way, in a privacy&security measure.

  • aboringusername 1125 days ago
    Can someone please explain to me how the collection of contact data is in any way legal under the GDPR and why Microsoft (Windows), Apple/Google haven't been required to make changes to prevent abuse of this permission (such as selecting specific contacts).

    I'd also like to know why, if my contact data is shared, I am not informed of it. If my data is uploaded by Google to their servers, I should know. If somebody chooses to share my data with $app, I should know, and be able to "opt out" of being included (although it should be opt-in!)

    Being able to mass collect what is often the most sensitive information means that consistent data is now a liability; keeping the same number/email can be useful for cross-referencing. Ideally you should rotate what data you can (physical address/location is obviously extremely difficult). Everything else is possible (browsers/IP addresses/emails/User Agent strings, phone numbers etc etc)

    The best idea is to "troll" with your data; put insane items in your logged in basket (ebay/amazon etc), like sex toys. You can even make an order (and refund it) to further poison the well. Log in to Google and do some disgusting searches, and train algorithms to have the "wrong idea" about you, this is a reality we're now facing as this data can (and will) be used against you at any opportunity.

    • JCDenton2052 1125 days ago
      The best idea is to not use their services. Switch from Windows to Linux, de-google and if you must use Android keep the data on your phone to a minimum.
    • djrogers 1125 days ago
      > and why Microsoft (Windows), Apple/Google haven't been required to make changes

      I don't believe there's anything in the GDPR that gives it the ability to regulate entities several steps removed from the violations. If company A uses a posted letter to ask for PII then stores it in violation of the GDPR, would you then regulate the post office?

    • Nextgrid 1124 days ago
      The data protection agencies who are given the power to enforce the GDPR are completely incompetent at doing so, or are corrupt and silently benefit from the status quo.

      The web is filled with various dark patterns (and even companies whose entire business is to provide such dark patterns as a service) that would fall afoul of the GDPR, and yet nothing is being done.

      Google and Facebook - part of the most popular websites worldwide, and with business presence in all the major EU countries - have had a non-compliant cookie/tracking consent flow for over a year now and nothing is being done.

      If even blatantly obvious breaches (which would be trivial to litigate) are ignored, something more murky like Apple (a neutral third-party) merely providing a tool that can be used to violate the GDPR (but also can be used legitimately) has no chance of making any progress.

  • bschne 1125 days ago
    The problem with this approach is twofold:

    a) At the margin, a few people doing this does _nothing_ to mess with big companies' data collection & analysis. But opting out also has the same problem, obviously, so at least it's not doing worse.

    b) In the absence of sandbox / selective sharing features like other commenters have mentioned, or you going so far as to _only_ keep fake contacts in your phone, using this approach requires you to also share your actual contacts with the app, thus giving away PII of unconsenting third parties. Yes, I'd rather blame the app developers for collecting this data in the first place, but I'd still prefer not to give my contacts away whenever I can reasonably withhold them.

  • aasasd 1125 days ago
    On Android, IIRC I've seen a dialer app that stores contacts in its own database instead of the system one. That seems a better approach than this, at least if other apps also don't write to the shared contacts.

    (It was probably an open-source dialer on F-Droid, but don't remember exactly which one.)

    Anyway, an even better approach of course is to tell data-slurping apps to bugger off.

    Edit: come to think of it, maybe alternative Android ROMs could fence the contacts so that an app only sees its own unless the user specifically selects someone. I guess this is similar to Apple's trick with Photos.

  • collaborative 1125 days ago
    Phone numbers are too public. The reason they're used by messaging apps is that they are a goldmine to have. They actually make it harder to chat (ever tried using WhatsApp/Signal on a PC? Yes, you'll need to have it installed on your phone first, and to have given over your contacts).

    That's why I chose to set (masked) emails as the primary ID on groupsapp.online, and even these can't be seen publicly unless you share a "group". Others will just see XXXX@gmail.com.

    • Scaless 1125 days ago
      At that point why even show the masked emails at all? All you're doing is leaking people who use their own custom domains (e.g. XXXX@myname.com), and XXXX@gmail.com isn't going to help me know who that person is either.
      • collaborative 1112 days ago
        Touché. I shall also hide the domain then, thx for the tip
  • atum47 1125 days ago
    You can always use bash or python to create vcards and import them in your phone.

    I've used this technique once to generate a bunch of numbers to find a person's WhatsApp; it works just fine.
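    A minimal sketch of what atum47 describes: generating vCard (.vcf) entries with Python so they can be imported into a phone's contacts. The names, number prefix, and output filename are all invented for illustration.

```python
import random


def make_vcard(name: str, number: str) -> str:
    """Return a minimal vCard 3.0 entry for one contact."""
    return (
        "BEGIN:VCARD\n"
        "VERSION:3.0\n"
        f"FN:{name}\n"
        f"TEL;TYPE=CELL:{number}\n"
        "END:VCARD\n"
    )


def make_batch(count: int, prefix: str = "+15555") -> str:
    """Generate `count` fake contacts with random 6-digit suffixes."""
    cards = []
    for i in range(count):
        number = prefix + f"{random.randrange(10**6):06d}"
        cards.append(make_vcard(f"Fake Contact {i}", number))
    return "".join(cards)


if __name__ == "__main__":
    # Most phone contact apps can import a .vcf file containing
    # multiple concatenated vCard entries.
    with open("fake_contacts.vcf", "w") as f:
        f.write(make_batch(100))
```

    The same idea works for atum47's number-enumeration trick: swap the random suffix for a sequential range and import the batch.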

  • 0df8dkdf 1125 days ago
    That is why we should have a custom contacts app with its own encryption (like KeePass) to store our real contacts, so that no app, and neither Apple nor Google, has access to them.

    For some people, like political or activist fundraisers, contact privacy is of the utmost importance. In fact, some of them still store their contacts in a Rolodex and will not put any of that into digital form. As a software developer, I actually support that tremendously.

  • adsharma 1125 days ago
    I wonder if people have thought about another variant of this. An app that maintains two address books and switches between them based on context.
    • tanelpoder 1125 days ago
      Or just some form of "share only these contacts with app X" option at the device system/OS level.
      • adsharma 1125 days ago
        Given the tracking-cookie situation, apps could refuse to work if that option is turned on: they can easily detect that they are seeing a small number of contacts relative to the average.

        With the two address book solution, they should have no way of telling which one is the real address book.

  • nbzso 1125 days ago
    All the shady data schemes and dark patterns in today's idea of a software business have motivated me to look at my phone as an enemy and to use the web cautiously at all times. Frankly, the idea of a hyperconnected future in which 24/7 monitoring of individuals is normalized and mandatory makes me cringe. The Internet is turning from a force for good into a dystopian toolchain by the hour, and all because we as a society cannot find an effective way to limit greed.
    • throwawei369 1125 days ago
      Wait until IoT becomes mainstream. I foresee tiny chips staging a mass-scale mutiny against their creators and colonizing us (best-case scenario)
      • shervinafshar 1125 days ago
        I wonder how dystopian sci-fi would read in such future? I mean...what would be their parable of The Matrix?
        • throwawei369 1125 days ago
          You joke. But what if we're playing right into their game and the robot resistance is already underway? What if there's more to the vaccines we're injecting into ourselves? Is Bill Gates even a real person, or just a simulation?
    • wruza 1125 days ago
      Because of some [ˈklʌbhaʊs], a shitty app promoted and used by the hype-flex-and-chill type of "people"? Just let them be and move on; what do you think you're missing there? If you see them as a source of income, a second job-only phone is a must anyway.
    • Klwohu 1125 days ago
      The Internet was designed to be dystopian before it was even technically implemented.
    • federona 1125 days ago
      Society, current society, also called capitalism, is designed not for greed but for constant growth. When your goal is not satisfaction but constant growth, and you are already a billion-dollar company, that makes you look at all the shady things you can still do and get away with in order to grow. These companies don't need to grow; if anything, they should be growing smaller and more sustainable if we actually wanted to engineer towards goodness rather than money. The fact that the rich are getting richer while having absolutely no need for it says to me that our priorities are wrong and our engineering of business is wrong. A lot of the common laws, rules, and norms around which business is built are insane.

      That is to say that if the economy is a mirror of nature, then businesses should be engineered to die. Not to be a going concern forever. After a certain amount of profit is extracted and life is lived, into the grave they should go. Not just as a result of competition, but as a result of system design.

      This would then lead to a more evolutionary world and better distribution of power and resources rather than continuous monopolizing and consolidation. Also a different mentality of you can't take it with you to the grave, rather than infinite mindset. It would be a cyclical mindset about finite things, not infinite things. Corporations want to be people, so engineer them like people and less like machines.

  • nom 1125 days ago
    Hm, can it be estimated, or is there public information on, how many phone numbers are taken? E.g. if I generate a valid number for one country or state, how likely is it that the number is in use or registered?

    I once got a phone call from a university student doing a survey for their project, and they told me they generate the numbers randomly, which really makes me wonder: how likely is a hit?

  • ckgjm 1124 days ago
    Things might change big time in this space. Apple and Facebook are slugging it out on tracking personal data. https://www.cnet.com/news/facebook-vs-apple-heres-what-you-n...
  • 2Gkashmiri 1124 days ago
    Ever since iOS 5, I think, there has been permission control behind a separate password. Why can't Android replicate that? Nowadays there seems to be some form of "permissions protection", but sadly all apps say "you seem to have contacts protection enabled, please disable it for best results". What's the point?
  • I_am_tiberius 1124 days ago
    Just reading the headline and thinking: providers (like Google) may still be able to filter out fake profiles if those fake contacts have no relations to each other. Meaning that if only you have a contact with a random number, and nobody else has it, it's most likely fake.
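    A hypothetical sketch of the filtering described above, assuming a provider can cross-reference all uploaded address books server-side: a number that appears in only one address book and does not belong to any registered user is a likely fake. All data and thresholds here are invented.

```python
from collections import Counter


def likely_fake(address_books, registered, min_books=2):
    """Flag numbers seen in fewer than `min_books` uploaded address
    books that also don't belong to any registered user."""
    # Count in how many distinct address books each number appears.
    seen = Counter(
        num for book in address_books.values() for num in set(book)
    )
    return {
        num
        for num, n in seen.items()
        if n < min_books and num not in registered
    }


books = {
    "alice": ["+15550001", "+15550002", "+2190000001"],  # last is fake
    "bob":   ["+15550001", "+15550003"],
}
# Shared numbers and registered users survive; the orphan is flagged.
print(likely_fake(books, registered={"+15550002", "+15550003"}))
# → {'+2190000001'}
```

    This is why a fake contact that only ever appears in one person's address book gives off a strong signal, exactly as the comments about backend cross-referencing suggest.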
  • nvoid 1125 days ago
    I was looking through my contacts the other day, deleting some people I don't speak to any more. It's interesting that with 5 or so sufficiently unique contacts I could be identified: if they were unique enough, no one else in the world could possibly know all 5 of those people. Scary thought.
  • fsflover 1125 days ago
    Or just stop using operating systems and apps which you don’t trust and switch to GNU/Linux phones.
  • aww_dang 1125 days ago
    Imagine if your fake contact's randomly created email or phone number is on a terror watch list.
    • praptak 1125 days ago
      I think that's exactly the point of this. I remember people on Usenet posting random shit like "construct bomb kill president" when the news about Echelon came out.
    • corentin88 1125 days ago
      The documentation states that it uses a non-allocated country code (+21). So it seems unlikely to happen.
      • toast0 1125 days ago
        +21 isn't allocated, but

           +211 South Sudan
           +212 Morocco
           +213 Algeria
           +216 Tunisia
           +218 Libya
        
        Someone putting random numbers after +21 because it's unallocated has a fundamental misunderstanding of international phone numbers.

        But also, the server side is likely to throw away invalid numbers to start with. It's simple and easy to do, and reduces the data storage by a lot (there's a lot of garbage in people's address books)
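        A quick sketch of toast0's point, using only the allocations listed above: "+21" is unallocated as a two-digit code, but random digits appended to it frequently land inside real three-digit country codes. The number format and sample size below are invented for illustration.

```python
import random

# ITU +21x allocations named in the comment above; the other
# third digits (0, 4, 5, 7, 9) are unallocated.
ALLOCATED_21X = {
    "211": "South Sudan",
    "212": "Morocco",
    "213": "Algeria",
    "216": "Tunisia",
    "218": "Libya",
}


def resolves_to_real_country(number):
    """Return the country a +21x number actually falls under, if any."""
    digits = number.lstrip("+")
    return ALLOCATED_21X.get(digits[:3])


random.seed(0)
hits = sum(
    resolves_to_real_country("+21" + f"{random.randrange(10**8):08d}")
    is not None
    for _ in range(1000)
)
print(f"{hits}/1000 random '+21' numbers map to a real country code")
```

        With 5 of the 10 possible third digits allocated, roughly half of random "+21" numbers collide with a real country code, so "unallocated prefix" buys much less than it sounds.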

      • dustymcp 1125 days ago
        Doesn't this defeat the purpose, though, as it could be filtered out?
        • o-__-o 1125 days ago
          The us government monitored all DC residents personal communication for over 2 years because they fat fingered the collection regex. The country code for Egypt is +20, the DC area code is 202.
          • IAmGraydon 1125 days ago
            You think that was a mistake, huh?
          • grandinj 1125 days ago
            That is a mistake that sounds suspiciously self serving, given how many powerful people live and work there
        • 0x426577617265 1125 days ago
          Yes, this data could be quickly mitigated.
  • naebother 1125 days ago
    How does this help me? Malicious apps are still going to scoop up my real contacts, right? And what if one of the random phone numbers belongs to someone deemed a "terrorist" by one of the imperial powers, and I'm judged guilty by association?
  • GekkePrutser 1125 days ago
    I wonder if this works at all..

    These companies simply use your contacts to map connections to other users. Including fake ones will do nothing, as they don't point anywhere; Big Data will just filter them out.

  • _trampeltier 1125 days ago
    I have no contacts at all on my phone; I created something myself. Now I think it would be funny to brute-force Android's contacts and just add every number from my country's phone providers :-)
  • IncRnd 1124 days ago
    If the real problem is that your contacts can be stolen, it makes no sense to add noise to them instead of securing them.

    Do you install lots of trivial apps, which you give permission to access your contacts?

  • yalogin 1125 days ago
    This is not achieving anything positive. I don't see which privacy threat it's fixing, other than adding a new app into the mix that could, at some point in the future, suck up the contacts itself :)
    • alcover 1125 days ago
      That is pretty insightful. Do you have an email ?
      • yalogin 1124 days ago
        Ha that is a good one.
  • tyingq 1125 days ago
    Bsd style globbing is handy for this sort of thing. Like in Perl:

      use File::Glob qw/bsd_glob/;
      my @list = bsd_glob('This is nested {{very,quite} deeply,deep}');
  • the_local_host 1125 days ago
    I have to say the spirit of this fake_contacts app is very appealing. Why stop at defending your data, when you can attack?
  • dredmorbius 1124 days ago
    Data-poisoning is an attractive approach (and one I've considered and occasionally practiced) but it does relatively little as a practical matter. @cyberlab posted some good links here: https://news.ycombinator.com/item?id=26286686

    Cory Doctorow also addressed this in a recent Reddit AMA: https://old.reddit.com/r/privacy/comments/j444u4/how_to_dest...

    From the harvesters' perspective, a long list of pattern-matching identifiers with no visible history anywhere else online ... will tend to get junked fairly readily. A small increase in undeliverable addresses from a swipe ... won't increase costs much.

    Creating wholesale online personas (effectively: bots and troll farms writ large) might start posing a challenge, but those would still likely give off a strong signal of falseness due to lack of correlation with other identifiers, most notably devices of their own, credit cards or other payments data, other data-linked services (transit or toll passes, etc.).

    Ultimately the question is why are you doing this and what do you hope to accomplish?

    (Though I've salivated a few times contemplating a system that would stream endless bits as responses to cookie or similar requests, just for shins and grits.)

  • paulie_a 1125 days ago
    Data poisoning needs to become a standard practice. Make the "valuable" ad data useless
    • tjpnz 1125 days ago
      From an economics perspective it seems like a more viable approach. Most of the techniques considered state of the art now are likely easily detectable by Google and other ad tech companies - they have a very good idea of which data can be safely discarded. Rather than blocking Google Analytics I wonder what would happen if browsers started responding with garbage.
    • throwawei369 1125 days ago
      Couldn't agree more. It's a far better approach as a cloaking technique. Reason I use privacy-possum addon on Firefox.
  • 867-5309 1124 days ago
    slightly increased anonymity through user-fed obfuscation? if you don't want an app to access your contacts, deny it. if it insists, delete it. the only app on my phone which has access to contacts is.. Contacts
  • neonihil 1124 days ago
    I love this! We need more of this. Let's poison all the data that is collected!
  • heavyset_go 1125 days ago
    This can be easily bypassed by cross referencing contact lists on the backend.
  • ezconnect 1125 days ago
    Why not just create a contacts app and use that as your private contacts app?
  • ketamine__ 1125 days ago
    Is there a limit on the number of contacts Clubhouse would sync?
    • CharlesW 1125 days ago
      It's incredibly unlikely. This kind of social graph information is gold.
      • lanstin 1125 days ago
        I suspect it is less valuable than call logs. I have never deleted contacts, so I have over twenty years of entries with pretty low value (e.g. "call this number to find out about this real estate offering", or my old mechanic's number from a 2003-era phone) or accuracy. I only call about seven people, but those are significant links.
  • sanxiyn 1125 days ago
    What a great idea. Let's do more of these.
  • ianlevesque 1125 days ago
    Or click “Don’t allow”? What’s the point?
  • andix 1125 days ago
    Just don't share your contacts with apps that steal them and use them for marketing purposes.

    It is also illegal to do it (GDPR), if you don't have the permission of every single person in your contacts.

  • jp57 1125 days ago
    Can we get little Bobby Tables in there?

    https://xkcd.com/327/

  • ficklepickle 1125 days ago
    Sad state of affairs. AOL couldn't kill the open web, but "apps" have.

    The user agent should respect your wishes, but instead we are reduced to this insane work-around.

    Surveillance capitalism needs to die in a fire. To anybody working on that shit: I hate you. Personally, as an individual, I wish you harm.

    OK, that was hyperbole, but I do love the open web. RIP.

  • parkingpete 1125 days ago
    Hmmm, not good
  • antihacker_team 1125 days ago
    Vulnerabilities research. OWASP, PTES, NIST SP800-115. You pay only for the found bug, depending on the criticality. Over than 6 years of experience. email: info@antihacker.pw
  • ouromoros 1122 days ago
    I recall an article that said obfuscation is the final solution for our privacy.

    But do we really want this solution?

  • purpmint008 1124 days ago
    This is getting out of hand.
  • championrunner 1125 days ago
    Do you have a running APK ?
  • williesleg 1125 days ago
    Give me your data now!
  • allurbase 1125 days ago
    Take me to your leader.... I don’t like thieves!