I lived on the West Coast for 15 years, mostly in California, and in my experience Comcast/Xfinity is one of the worst companies I have ever encountered as a consumer. Not only is the service unreliable, the company tries to slap you with charges when you call to report a problem. One day I saw a Comcast technician working outside my apartment, and as he was leaving, my internet service went down. I called Comcast, they sent someone out a few hours later, the internet was restored... and then, when I got my monthly bill, there was an additional charge of $29 or something, iirc. When I called customer service to ask them to remove the charge, they transferred my call several times and put me on hold for several minutes at every transfer. Finally, after an hour on the phone, they agreed to remove the charge.
Since then, I have moved to Austin, where there are two large Internet Service Providers competing against each other. I mostly use Spectrum; it is not perfect and people here complain about it, but it is incomparably better than Comcast.
Here in Austin, the city council no longer allows Flock ALPRs (automated license plate readers) on city streets, but Home Depot and other businesses still use them in their parking lots, and they scan your vehicle's license plate every time you enter and exit the premises. Flock sells its data to ICE and law enforcement.
Plus they'll position them close to an intersection in the parking lot of a business so they can get around something like the restriction Austin put in.
I don't like that this is the case, but you understand that a pretty huge fraction of the country doesn't share your political premise that providing data for immigration enforcement is unethical, right? (I share it, but that shouldn't matter for the analysis.)
It seems weird to me to hyperfocus on Flock's role here rather than the role your own local municipalities play in deciding how to configure these things. Not sharing with ICE is apparently quite doable? At least to the point of requiring a court order to get access to the data, which is a vulnerability all online cameras share.
As the CEO of Flock, don't you feel you have more information to offer this community beyond the "we do not sell data" statement you've made over and over? The fact that you do not engage here with the ethical aspects of your product doesn't look good for you and only deepens suspicion that something darker is going on behind your doors.
The comment adjacent to mine links to several findings, including from the EFF, casting doubt on your assertions here. Specifically, the case of Texas using Flock data outside its jurisdiction (at a national level, even) against abortion seekers. You have no substantial comment to make on those, or on any of the other active discussions that have spawned on this platform over the past year? You're obviously reading them, yet you only remain "consistent" on a technicality.
What steps is Flock taking to address the privacy overreach? Do you have data sharing agreements with Palantir? If so, do they respect the same geofencing properties that your clients supposedly have full control over?
He's not arguing that the data isn't shared. He's saying that they don't sell it. Local PDs generally want to share their data with other law enforcement agencies.
That is irrelevant to my comment. Yes, it's abundantly clear what he's saying, he's said it so many times already, I don't need to read it again. I'm asking why he's not contributing more to the actual discussions surrounding his product instead.
Every community in the nation that is home to Flock cameras should look at the user agreement between their police department (or other Flock customers) and the company, to see whether it contains a clause stating that the customer “hereby grants Flock” a “worldwide, perpetual, royalty-free right and license” to “disclose the Agency Data… for investigative purposes.” This is the language that will govern in a community unless a department demands changes to the standard user agreement that Flock offers. That is something we absolutely urge any agencies doing business with Flock to do — and, the ACLU of Massachusetts found, is exactly what the Boston police department did.
---
What assurance does any member of the public have that your company does not and will not ever share data to which you claim a "worldwide, perpetual, royalty-free right and license"? Are you saying that the "customer" has the ability to set a "do not share" flag or something? What happens when they flip that flag at some point in the future? What redress does a victim have if you share data you did not, at that point in time, have permission to share?
You are selling tools that have zero upside and a lot of downsides, and that are used for the structural violation of citizens' privacy. Don't hide behind the claim that you're trying to help people stay safe; that is not what you are doing. And if you believe you can take credit for the upsides, then you really should take responsibility for the downsides.
The problem isn't zero upside, as other commenters have pointed out. The cameras have legitimate, lawful, and useful purposes. You will not gain any traction with the public or with lawmakers as long as your arguments ignore that reality.
The problem is that the downside is unbounded.
We clearly don't have the control over our governments, in either direction or degree, that would be needed to ensure that the unbounded downside of ubiquitous networked cameras won't manifest itself.
What's the upside then, since it is so clear to you? Show me the stats on how these cameras actually reduce crime. Because to me they only show a possible decrease in one form of crime and a guaranteed increase in another.
Looking at your user page, I don't imagine you park your car on the street, do you? A lot of people have to. When (not if) it gets vandalized or stolen, it's nice to be able to identify the perpetrators and hold them to account.
Of course the rest of the justice system has to be firing on all cylinders to make that happen... but still, when you're a crime victim, more information is better than less.
> Looking at your user page, I don't imagine you park your car on the street, do you?
Yes, I do. And I've even had one stolen. And even that isn't enough to persuade me that putting cameras everywhere is going to make us safer. People are scared of their own shadow; it makes zero sense. Theft and other crime is as old as humanity, and it is a delusion to think that living in a panopticon is going to keep you safe from petty crime. But what it will do is enable much bigger crimes.
As far as my car: we have this amazing thing called insurance. And they were most reasonable when my car was stolen and yes, I'm still pissed off about it. But cameras would not have stopped that.
Car theft tends to be perpetrated by a small number of repeat offenders. Cameras would indeed have helped in your case... but only if they were installed in the last neighborhood where the thieves were active, if the police used the evidence to track them down, if the prosecutor's office used the evidence to charge them, and if the courts used the evidence to lock them up.
Admittedly those are all big leaps of faith around here, where car thieves are handled on a catch-and-release basis and where we usually don't even bother with the 'catch' part. You could argue that law enforcement doesn't need any new toys if they don't use the ones they already have, and I certainly wouldn't disagree with that.
I think a lot depends on who owns and controls the cameras. I'd object to ALPRs being installed in my rural neighborhood, certainly. But I see little other than upside in private security cameras whose footage I can choose to share with the police, or with anyone else for that matter. Which is why that's what I have.
At the same time, cameras in urban settings are much less scary and offensive to me for some reason, partially because I disagree that anyone has any expectation of privacy in such settings, and partially because I believe that ship has sailed and anyone bothering to object is just wasting their breath.
The best we can hope for is aggressive public oversight of such cameras. The company itself can't be expected to show any leadership in that area; it has to come from us.
Sure, but that's exactly where it fails: the oversight. So you end up with all of this data in the hands you least want it in, never mind the criminals who gain access to it in the inevitable data leaks, and then all of that data gets used against you.
There is zero correlation between whether these cameras are installed and crime incidence rates or the number of cases solved.
Ironically, what did reduce crime - considerably so, even - was COVID. But I don't see anybody arguing for a curfew to reduce crime either.
I'm looking for convincing decoy ALPR cameras because I don't think my HOA will go for a real setup, and I've got concerns over the product's security. I want the appearance of surveillance if I can't get the real thing. Being on a Flock/ALPR tracking app/site would be a huge win.
There is no benefit to signaling one's virtue in this scenario. It's like having a sign in your yard that says "Proudly Gun-Free Household".
> My neighborhood is very safe and we have no such cameras.
Good for you.
> why do you think cameras are the only solution?
Straw man.
I want to deter criminals from even thinking about targeting my neighborhood. The appearance of surveillance might serve as a powerful deterrent. Inclusion on a site that warns criminals where ALPR cameras are located would be a boon to this effort. Convincing decoy camera housings, the subject of my post, might be enough to get the neighborhood listed without actually having to go forward with a full Flock installation.
Let me be extremely clear: there's no member of the set of humans that actively avoid ALPR cameras that I want coming to my home uninvited. Not a single one.
This is part of the problem with Flock, IMO. Lack of adherence to or support of norms. Psychopathy actualized as a corporation.
The societal impact of the disruption of trust and of personal privacy is under-appreciated by the corporation. It's concerned only with winning profit.
(Meta) It's an unspecific argument I'm lazily laying out, yes, but the problem is ridiculously obvious.
We should not have to ask to be respected, and here we are.
Democratic decline (in both the systems and participation in them), truth, self-respect, understanding of one's own rights... those qualities are dying under the relentless, toxic, ethically under-explored capitalization of our laws and resources. (Especially in the USA; compare countries with stronger corporate social responsibility traditions, I suspect.)
Tech disruption is amazing to watch, and participate in, like a fire consuming the forest. "But what about the children?"
Seems like a broad dismissal of the claim made upthread ("Flock sells its data to ICE and law enforcement"). Why do you think it is excessively specific?
Because the claim was specific on two points: that the data is sold, and that Flock is the one doing it.
Flock does not sell data, they willingly give it away for free. And, technically, they don't do it - their customers do, and Flock knows and lets them.
Personally, I think this is worse. But they don't, specifically, sell data.
Being right on a technicality doesn't mean that everyone else is lying. We are not stupid, we were not born yesterday: we all understand that "selling data" does not literally mean exchanging money for data. It can also mean treating data haphazardly, or having a culture of extreme data collection. Both of which describe Flock.
It's not a "technicality"! diogenes_atx made a very specific, false claim. Don't do that!
> we all understand that "selling data" does not literally mean exchanging money for data.
You're completely wrong there. That is exactly what it means.
If what you mean is lax security practices, or collecting data in general, just say that. There's really no need to bend over backwards to defend this.
I'm not bending over backwards anywhere - I just disagree with you. Your opinion is not "blessed"; it is just as capable of being wrong as everyone else's.
They do not sell data, they willingly give it away for free - which is a form of selling data, with a price tag of $0.
Most reasonable human beings would actually say this is slightly worse than literally "selling" the data. Therefore, I think most people would agree with me, and not with you.
In my mind it's very similar to claiming you're not a thief because you give away the stuff you take. No, you're still a thief, you just love being a thief so much you don't even do it for monetary gain. Which is... worse!
Why are you so opposed to CloudFlare? It's not perfect, but definitely better than Google and most ISPs... You might try an experiment to see whether you are able to reach the archive.ph domain with CloudFlare, if only to see if DNS is the problem.
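A quick way to run that experiment, as a sketch (assuming the dig tool is installed; 1.1.1.1 is CloudFlare's public resolver, 8.8.8.8 is Google's):

[user@host]$ dig @1.1.1.1 archive.ph +short    # ask CloudFlare's resolver directly
[user@host]$ dig @8.8.8.8 archive.ph +short    # Google's resolver, for comparison

If the first query returns nothing while the second returns an address, the problem is DNS rather than the connection itself.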
From what I understood, it's the torrent link that downloads a compromised zip file rather than the authentic image:
"Torrent downloads over at https://xubuntu.org/download/ are serving a zip file with a suspicious exe and a tos.txt inside. The TOS starts with Copyright (c) 2026 Xubuntu.org which is sus, because it is 2025. I opened the .exe with file-roller and couldn't find any .torrent inside."
Ah. Those work by having a valid zip at the end (and extraction code in front), taking advantage of the zip format allowing for arbitrary data before the actual zip data (which in turn was intended to facilitate this sort of thing).
It hadn't occurred to me that the .exe in question would be a self-extracting archive (or malicious code that also self-extracts an archive as part of its operation).
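The prepended-data trick is easy to demonstrate, by the way. A rough sketch (stub.bin and payload.zip are hypothetical stand-ins):

[user@host]$ head -c 1024 /dev/urandom > stub.bin          # stand-in for the extractor code
[user@host]$ cat stub.bin payload.zip > selfextract.exe    # arbitrary bytes first, a valid zip last
[user@host]$ unzip -l selfextract.exe                      # still lists the archive's contents

unzip locates the end-of-central-directory record by scanning backwards from the end of the file, so the prepended bytes only produce a warning about extra bytes at the beginning of the zipfile.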
File roller does use 7z internally, so no real surprise here.
But both implementations can have vulnerabilities that a maliciously crafted file could exploit, so it's not a great idea to do this with a file you already suspect to be malicious.
This URL is on the main Xubuntu website, under "Xubuntu 24.04": click "Release page," then select United States. From there, you download the following files: SHA256SUMS, SHA256SUMS.gpg, xubuntu-24.04.3-desktop-amd64.iso
The output of the other checksum commands is shown here:
[user@host]$ gpg --keyid-format long --verify SHA256SUMS.gpg SHA256SUMS
gpg: Signature made Thu 07 Aug 2025 06:05:22 AM CDT
gpg: using RSA key 843938DF228D22F7B3742BC0D94AA3F0EFE21092
gpg: Can't check signature: No public key
[user@host]$ sha256sum --check SHA256SUMS
xubuntu-24.04.3-desktop-amd64.iso: OK
(output omitted for results of Xubuntu minimal version, which was not downloaded)
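The "Can't check signature: No public key" line means nothing has been verified yet. If I'm following the usual Ubuntu verification steps correctly, you first fetch the signing key (the ID is in the output above) and then re-run the check:

[user@host]$ gpg --keyid-format long --keyserver hkps://keyserver.ubuntu.com --recv-keys 0x843938DF228D22F7B3742BC0D94AA3F0EFE21092
[user@host]$ gpg --keyid-format long --verify SHA256SUMS.gpg SHA256SUMS

You want to see a "Good signature" line on the second run; until then, the sha256sum check only proves the ISO matches a checksum file that could itself have been swapped out.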
The checksum is a cryptographic hash generated from the ISO file's contents. While the checksum for a specific, unchanged ISO file is fixed, the checksum that is published on a website could be deliberately altered by an attacker to hide a modified, malicious ISO.
Generally speaking, a signature is created by signing a checksum value with the owner's private key. The corresponding public key should ideally be distributed through a chain of trust, so that it can be obtained via a trusted channel.
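As a concrete sketch of both halves (assuming gpg and an existing keypair), this is roughly how a detached signature like SHA256SUMS.gpg gets produced and checked:

[user@host]$ gpg --output SHA256SUMS.gpg --detach-sign SHA256SUMS   # maintainer side: signs with the private key
[user@host]$ gpg --verify SHA256SUMS.gpg SHA256SUMS                 # user side: verifies with the matching public key

The signature binds the checksum file to whoever holds the private key; the weak point remains obtaining the public key over a channel you actually trust.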
Since the distro's site was compromised you also have to check that any keys it distributes haven't changed. And that the compromise wasn't done by a legitimate maintainer.
We are in a perpetual loop: inefficient verification methods, a bunch of steps, rediscovering what a supply-chain attack is, a bunch more steps, and then back around again.
If an attacker can upload a compromised ISO I assume they can also upload a compromised checksum? In the age of https downloads — where the payload cannot be modified in transit — it never made sense to me why ISO checksums are a thing. For checksums to actually do anything there needs to be a chain of trust back to a trusted entity.
Lots of small, volunteer-run, low/zero-budget open-source projects cannot afford to pay for the server/CDN bandwidth they would need to host all their binary artifacts (ISOs, packages, etc.). They end up relying on mirrors provided for free by third parties instead. By publishing the checksums, they allow you to verify that the ISO image you downloaded from some mirror is the same one that they originally published.
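So the usual pattern is: large artifact from whatever mirror is fast, small checksum file from the project's own site. A sketch with hypothetical URLs:

[user@host]$ curl -LO https://mirror.example.net/xubuntu/xubuntu-24.04.3-desktop-amd64.iso   # untrusted mirror (hypothetical URL)
[user@host]$ curl -LO https://xubuntu.org/SHA256SUMS                                         # official site (path is hypothetical)
[user@host]$ sha256sum --check --ignore-missing SHA256SUMS                                   # skip entries for files not downloaded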
TLS uses message authentication codes, which should detect tampering or bit errors. In theory, a cosmic ray could hit the RAM of the receiving device and flip a bit after the ISO has been decrypted but while it is still in RAM. Checksumming does not rule bit flips out, though, as you could checksum the ISO and have the flip happen between then and when the ISO is actually used to install the system.
Maybe in theory you could checksum the installed filesystem, but I'm not sure whether any distros actually do that, and it would require deterministic install layouts.
It can fail midway, of course, but it really shouldn't corrupt in any other way. HTTPS is authenticated, after all, and malicious manipulation is harder to defend against than accidental corruption, so if the former is covered, the latter is too. But this is reality, and you can have bugs and errors outside the transport, of course: your file system could corrupt data, your drive could be bad, etc.
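One way to close part of that gap is to verify the written media itself, reading back exactly as many bytes as the ISO contains. A sketch, with /dev/sdX standing in for the target device:

[user@host]$ sha256sum xubuntu-24.04.3-desktop-amd64.iso
[user@host]$ sudo head -c "$(stat -c %s xubuntu-24.04.3-desktop-amd64.iso)" /dev/sdX | sha256sum

Matching hashes mean the bytes on the stick are the bytes from the ISO; it says nothing about corruption that happens later, of course.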
For security you want signatures with a known, trusted key.
For those who are looking for information and analysis about Palantir, there is an academic study about surveillance technology with useful information about the company:
Sarah Brayne (2020) Predict and Surveil: Data, Discretion, and the Future of Policing, Oxford University Press.
As the book explains, Palantir is one of the largest companies specializing in surveillance data management services for law enforcement, the military, and corporations. Palantir does not own its data but rather provides an interface that runs on top of other data systems, including legacy systems, making it possible to link data points across separate systems. Palantir gathers its data primarily from "data brokerage firms," including LexisNexis, Thomson Reuters CLEAR, Acxiom, CoreLogic, Cambridge Analytica, Datalogix, Epsilon, Accurint. As Brayne observes, these data brokerage firms "collect and aggregate information from public records and private sources, e.g., drivers licenses, mortgages, social media, retail loyalty card purchases, professional credentials, charities’ donor lists, bankruptcies, payday lenders, warranty registrations, wireless access points at hotels and retailers, phone service providers, Google searches and maps geolocation, and other sources who sell your data to customers willing to pay for it. Yet it is difficult to fully understand the scope of the data brokerage industry: even the FTC cannot find out exactly where the data brokers get their information because brokerages cite trade secrecy as an excuse to not divulge their sources."
Why is this a concern for people living in a democratic society with a legal system that supposedly protects individual freedoms? "Big data companies argue that their proprietary algorithms and data are trade secrets, and therefore they refuse to disclose their data, code and techniques with criminal defense attorneys or the public" (p. 135). This means that, "In many cases it is simply easier for law enforcement to purchase data from private firms than to rely on in-house data because there are fewer constitutional protections, reporting requirements and appellate checks on private sector surveillance and data collection, which enables police to circumvent privacy laws" (pp. 24-5, 41-2).
> This means that, "In many cases it is simply easier for law enforcement to purchase data from private firms than to rely on in-house data because there are fewer constitutional protections, reporting requirements and appellate checks on private sector surveillance and data collection, which enables police to circumvent privacy laws"
Another way to phrase this is:
Why transform government into Big Brother[0], with
all the hassle of oversight and accountability
this would entail, when outsourcing to Big Friends
via handsome contracts will achieve the same result
while enabling "plausible deniability" under oath?