I have been building things in the open for long enough that it has become a conviction rather than a habit.

Open source code. Published datasets. Documented methodology. The kind of work where anyone, including the platforms and governments you are researching, can read exactly how you reached your conclusions. This is not naivety. It is, I think, the only approach that produces findings that cannot simply be dismissed.

When platforms cannot challenge your method because your method is public, they have to engage with your findings. When your dataset is available to anyone, other researchers can build on it, extend it, find errors in it, improve it. When your code is published, the work survives you. It does not live in a proprietary system that disappears when the funding does.

That is what InTheOpen is about. The work, the method, and the thinking behind both.

The work covers a specific territory. Information manipulation: how it spreads, what infrastructure it uses, how you find it before it has done its damage, and what it means for the platforms and institutions that are supposed to be accountable for it. I have spent several years on this: tracking propaganda networks in real time, auditing how platforms recommend content to real people in their homes, investigating scam ad operations, building training tools for analysts and journalists, contributing to policy frameworks.

None of that was planned. It started with a Discord server during a COVID lockdown and kept going because the problems kept growing and the tools kept being needed.

The posts here will be about the methodology as much as the findings.
Occasionally something else.

One thing I will not do is pretend that this work happens in a clean, well-funded, institutionally supported environment. It mostly does not. It happens at the edges, with limited resources, by people who care about the problem more than the career structure around it.

That is worth being honest about. The research community working on information manipulation is small, under-resourced relative to the threat, and doing serious work in difficult conditions. The operations it is trying to track are backed by state resources and adapt continuously. The platforms it is trying to hold accountable have legal teams and communications strategies and every incentive to question the methodology rather than engage with the findings.

Working in the open is part of the answer to all of that. It is harder to dismiss work you cannot find a hole in. It is harder to ignore findings that other researchers have independently verified. It is harder to bury data that is already public.

That is what InTheOpen means. Not transparency as a value statement.
Transparency as a working method.