# Mesh Messaging in Large-scale Protests: Breaking Bridgefy

Together with Jorge Blasco, Rikke Bjerg Jensen and Lenka Marekova, we have studied the security of the Bridgefy mesh messaging application. This work was motivated by (social) media reports that this application was or is used by participants in large-scale protests, in anticipation of or in response to government-mandated Internet shutdowns (or simply because the network infrastructure cannot handle as many simultaneous devices as are present at such large protests). The first reports concerned Hong Kong; later reports covered India, Iran, the US, Zimbabwe, Belarus and Thailand (typically citing Hong Kong as an inspiration). In such situations, mesh networking seems promising: an ad-hoc local network is spanned between participants’ phones to route messages.

Now, Bridgefy wasn’t designed with this use case in mind. Rather, its designers had large sports events or natural disasters in mind. Leaving aside the discussion of whether those use cases, too, warrant a secure-by-default design, we had reason to suspect that the security offered by Bridgefy might not match the expectations of those who might rely on it.

Indeed, we found a series of vulnerabilities in Bridgefy. Our results show that Bridgefy currently permits its users to be tracked, offers no authenticity guarantees and no effective confidentiality protections, and lacks resilience against adversarially crafted messages. We verify these vulnerabilities by demonstrating a series of practical attacks on Bridgefy. Thus, if protesters rely on Bridgefy, an adversary can produce social graphs about them, read their messages, impersonate anyone to anyone and shut down the entire network with a single maliciously crafted message. For a good overview, see Dan Goodin’s article on our work at Ars Technica.

We disclosed these vulnerabilities to the Bridgefy developers in April 2020 and agreed on a public disclosure date of 20 August 2020. Starting from 1 June 2020, the Bridgefy team began warning their users that they should not expect confidentiality guarantees from the current version of the application.

Let me stress, however, that, as of 24 August, Bridgefy has not been patched to fix these vulnerabilities, which are thus present in the currently deployed version. The developers are currently implementing/testing a switch to the Signal protocol to provide cryptographic assurances in their SDK. This switch, if done correctly, would rule out many of the attacks described in our work. They hope to have this fix deployed soon.

# Conda, Jupyter and Emacs

Jupyter is great. Yet, I find myself missing all the little tweaks I made to Emacs whenever I have Jupyter open in a browser. The obvious solution is to have Jupyter in Emacs. One solution is EIN, the Emacs IPython Notebook. However, I had a mixed experience with it: it would often hang and eat up memory (I never bothered to try to debug this behaviour). A neat alternative, for me, is emacs-jupyter. Here’s my setup.

# What does “secure” mean in Information Security?

This text – written by Rikke Jensen and me – first appeared in the ISG Newsletter 2019/2020 under the title “What is Information Security?”. I’ve added a few links to this version.

The most fundamental task in information security is to establish what we mean by (information) security.

A possible answer to this question is given in countless LinkedIn posts, thought-leader blog entries and industry white papers: Confidentiality, Integrity, Availability. Since the vacuity of the “CIA Triad” is covered in the first lecture of the Security Management module of our MSc, we will assume our readers are familiar with it and will avoid this non-starter. Let us consider the matter more closely.

One subfield of information security that takes great care in tending to its definitions is cryptography. For example, Katz and Lindell write: “A key intellectual contribution of modern cryptography has been the recognition that formal definitions of security are an essential first step in the design of any cryptographic primitive or protocol”. Indeed, finding the correct security definition for a cryptographic primitive or protocol is a critical part of cryptographic work. That these definitions can be non-intuitive yet correct is made acutely apparent when asking students in class to come up with ideas of what it could mean for a block cipher to be secure. They never arrive at PRP security but propose security notions that are, well, broken.
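To illustrate that gap with a toy example of my own (not from the original text): a naive notion such as “the attacker cannot recover the key from ciphertexts” is satisfied by a “cipher” that ignores the key entirely, whereas PRP security – indistinguishability from a random permutation – rules it out immediately.

```python
import secrets

def identity_cipher(key: bytes, block: bytes) -> bytes:
    """'Encrypts' by returning the block unchanged -- the key is ignored."""
    return block

key = secrets.token_bytes(16)
block = secrets.token_bytes(16)

# The naive notion holds: ciphertexts carry zero information about the key,
# so key recovery is impossible. The scheme is nonetheless useless.
assert identity_cipher(key, block) == block

# A PRP distinguisher wins with overwhelming advantage: a uniformly random
# permutation of 128-bit blocks maps a random input to itself with
# probability 2^-128, whereas this "cipher" always does.
def looks_like_identity(enc) -> bool:
    x = secrets.token_bytes(16)
    return enc(x) == x

assert looks_like_identity(lambda b: identity_cipher(key, b))
```

The point of the formal definition is precisely to exclude such degenerate schemes that intuitive notions let through.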

Fine, we can grant cryptography that it knows how to define what a secure block cipher is. That is, we can know what is meant by it being secure, but does that imply that we are? Cryptographic security notions – and everything that depends on them – do not exist in a vacuum, they have reasons to be. While the immediate objects of cryptography are not social relations, it presumes and models them. This fact is readily acknowledged in the introductions of cryptographic papers where authors illustrate the utility of their proposed constructions by reference to some social situation where several parties have conflicting ends but a need or desire to interact. Yet, this part of the definitional work has not received the same rigour from the cryptographic community as complexity-theoretic and mathematical questions. For example, Goldreich writes: “The foundations of cryptography are the paradigms, approaches, and techniques used to conceptualize, define, and provide solutions to natural ‘security concerns’ ”. Following Blanchette we may ask back: “How does one identify such ‘natural security concerns’? On these questions, the literature remains silent”.

# Cryptographic Security Proofs as Dynamic Malware Analysis

This text first appeared in the ISG Newsletter 2019/2020. I’ve added a bunch of links to this version.

# Postdoc at Royal Holloway on Lattice-based Cryptography

Update: 25/09/2020: New deadline: 30 October.

We are looking for a postdoc to join us to work on lattice-based cryptography. This postdoc is funded by the EU H2020 PROMETHEUS project for building privacy-preserving systems from advanced lattice primitives. At Royal Holloway, the project is looked after by Rachel Player and me. Feel free to e-mail me with any queries you might have.

The ISG is a nice place to work; it’s a very friendly environment with strong research going on in several areas. We got people working across the field of information security including several people working on cryptography. A postdoc here is a 100% research position, i.e. you wouldn’t have teaching duties. That said, if you’d like to gain some teaching experience, we can arrange for that as well.

Also, if you have e.g. a two-body problem and would like to discuss flexibility about being in the office (assuming we’ll all be back in the office at some post-COVID-19 point), feel free to get in touch.

# SSH++

My research often has a computational component, which means logging into one of my servers, kicking off a long-running computation, waiting a few days and recovering the output. Here’s how I, inspired by Filippo Valsorda’s post, addressed some of the pain points with this sort of thing.

# Faster Enumeration-based Lattice Reduction

Our paper “Faster Enumeration-based Lattice Reduction: Root Hermite Factor $k^{1/(2k)}$ in Time $k^{k/8\, +\, o(k)}$” – together with Shi Bai, Pierre-Alain Fouque, Paul Kirchner, Damien Stehlé and Weiqiang Wen – is now available on ePrint (the work has been accepted to CRYPTO 2020). Here’s the abstract:

We give a lattice reduction algorithm that achieves root Hermite factor $k^{1/(2k)}$ in time $k^{k/8 + o(k)}$ and polynomial memory. This improves on the previously best known enumeration-based algorithms which achieve the same quality, but in time $k^{k/(2e) + o(k)}$. A cost of $k^{k/8 + o(k)}$ was previously mentioned as potentially achievable (Hanrot-Stehlé’10) or as a heuristic lower bound (Nguyen’10) for enumeration algorithms. We prove the complexity and quality of our algorithm under a heuristic assumption and provide empirical evidence from simulation and implementation experiments attesting to its performance for practical and cryptographic parameter sizes. Our work also suggests potential avenues for achieving costs below $k^{k/8 + o(k)}$ for the same root Hermite factor, based on the geometry of SDBKZ-reduced bases.
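For readers outside the lattice community, a brief reminder of standard terminology (my addition, not part of the abstract): a basis $(\mathbf{b}_1, \ldots, \mathbf{b}_d)$ of a lattice $\Lambda$ achieves root Hermite factor $\delta$ if

$$\|\mathbf{b}_1\| = \delta^{d} \cdot \mathrm{vol}(\Lambda)^{1/d},$$

i.e. $\delta$ measures how short the first basis vector is relative to the lattice volume. Smaller $\delta$ means better reduction quality, so achieving $\delta = k^{1/(2k)}$ in time $k^{k/8 + o(k)}$ rather than $k^{k/(2e) + o(k)}$ is a speed-up at fixed output quality.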

# Open Letters v Surveillance

A letter, signed by 177 scientists and researchers working in the UK in the fields of information security and privacy, reads:

“Echoing the letter signed by 300 international leading researchers, we note that it is vital that, when we come out of the current crisis, we have not created a tool that enables data collection on the population, or on targeted sections of society, for surveillance. Thus, solutions which allow reconstructing invasive information about individuals must be fully justified.1 Such invasive information can include the ‘social graph’ of who someone has physically met over a period of time. With access to the social graph, a bad actor (state, private sector, or hacker) could spy on citizens’ real-world activities.”

Our letter2 stands in a tradition of similar letters and resolutions. For example, here is the “Copenhagen Resolution” of the International Association for Cryptologic Research (IACR), i.e. the professional body of cryptographers, from May 2014:

“The membership of the IACR repudiates mass surveillance and the undermining of cryptographic solutions and standards. Population-wide surveillance threatens democracy and human dignity. We call for expediting research and deployment of effective techniques to protect personal privacy against governmental and corporate overreach.”

Both of these documents, in line with similar documents, treat privacy and surveillance rather abstractly. On the one hand, this has merit: many of us, including myself, are experts on the technology and can speak to it with authority. Also, getting people working on privacy to agree that privacy is important is straightforward; getting them to agree on why is a much more difficult proposition. On the other hand, when we are making political interventions such as passing resolutions or writing open letters, I think we should also ask ourselves why. I suspect we won’t all agree, but the added clarity might still be helpful.

Similar considerations have been voiced in several works before, e.g.

“The counter-surveillance movement is timely and deserves widespread support. However, as this article will argue and illustrate, raising the specter of an Orwellian system of mass surveillance, shifting the discussion to the technical domain, and couching that shift in economic terms undermine a political reading that would attend to the racial, gendered, classed, and colonial aspects of the surveillance programs. Our question is as follows: how can this specific discursive framing of counter-surveillance be re-politicized and broadened to enable a wider societal debate informed by the experiences of those subjected to targeted surveillance and associated state violence?” – Seda Gürses, Arun Kundnani & Joris Van Hoboken. Crypto and empire: the contradictions of counter-surveillance advocacy in Media, Culture & Society, 38(4), 576–590. 2016

and

“History teaches that extensive governmental surveillance becomes political in character. As civil-rights attorney Frank Donner and the Church Commission reports thoroughly document, domestic surveillance under U.S. FBI director J. Edgar Hoover served as a mechanism to protect the status quo and neutralize change movements. Very little of the FBI’s surveillance-related efforts were directed at law-enforcement: as the activities surveilled were rarely illegal, unwelcome behavior would result in sabotage, threats, blackmail, and inappropriate prosecutions, instead. For example, leveraging audio surveillance tapes, the FBI attempted to get Dr. Martin Luther King, Jr., to kill himself. U.S. universities were thoroughly infiltrated with informants: selected students, faculty, staff, and administrators would report to an extensive network of FBI handlers on anything political going on on campus. The surveillance of dissent became an institutional pillar for maintaining political order. The U.S. COINTELPRO program would run for more than 15 years, permanently reshaping the U.S. political landscape.” – Phillip Rogaway, The moral character of cryptographic work in Cryptology ePrint Archive. 2016

The pertinent question then is what surveillance is for. For an answer, we need look no further than GCHQ’s page about the Investigatory Powers Act 2016, which “predominantly governs” its mission:

“Before an interception warrant can be issued, the Secretary of State must believe that a warrant is necessary on certain, limited grounds, and that the interception is proportionate to what it seeks to achieve.

These grounds are that interception is necessary:

• In the interests of national security; or
• In the interests of the economic well-being of the UK; or
• In support of the prevention or detection of serious crime” – GCHQ. Investigatory Powers Act. 2019

To unpack what this means in detail, we can recall how and when the means of surveillance have been deployed in recent history, by GCHQ and other departments of the British State. A programme that might cover all three aspects was GCHQ’s Tempora programme which tapped into fibre-optic cables for secret access to the world’s communications. Similarly, hacking the SIM manufacturer Gemalto would also cover all three. Spying on the German government is probably one for the economic well-being of the UK. Similarly, the London Metropolitan Police working with construction firms to blacklist trade unionists would fall into that category. UK intelligence services spying on Privacy International, police officers infiltrating Greenpeace and other activist groups, and databases of activists are plausibly done in the name of national security. The stop and search policy tackles, amongst other things, the serious crime of walking while black.

When we write that “solutions which allow reconstructing invasive information about individuals must be fully justified”, it is worth paying close attention to the justifications offered and asking: justified to whom, and by what standard?

## Appendix: The letter in full

We, the undersigned, are scientists and researchers working in the UK in the fields of information security and privacy. We are concerned about plans by NHSX to deploy a contact tracing application. We urge that the health benefits of a digital solution be analysed in depth by specialists from all relevant academic disciplines, and sufficiently proven to be of value to justify the dangers involved.

A contact tracing application is a mobile phone application which records, using Bluetooth, the contacts between individuals, in order to detect a possible risk of infection. Such applications, by design, come with risks for privacy and medical confidentiality which can be mitigated more or less well, but not completely, depending on the approach taken in their design. We believe that any such application will only be used in the necessary numbers if it gives reason to be trusted by those being asked to install it.

It has been reported that NHSX is discussing an approach which records centrally the de-anonymised ID of someone who is infected and also the IDs of all those with whom the infected person has been in contact. This facility would enable (via mission creep) a form of surveillance. Echoing the letter signed by 300 international leading researchers, we note that it is vital that, when we come out of the current crisis, we have not created a tool that enables data collection on the population, or on targeted sections of society, for surveillance. Thus, solutions which allow reconstructing invasive information about individuals must be fully justified. Such invasive information can include the “social graph” of who someone has physically met over a period of time. With access to the social graph, a bad actor (state, private sector, or hacker) could spy on citizens’ real-world activities. We are particularly unnerved by a declaration that such a social graph is indeed aimed for by NHSX.

We understand that the current proposed design is intended to meet the requirements set out by the public health teams, but we have seen conflicting advice from different groups about how much data the public health teams need. We hold that the usual data protection principles should apply: collect the minimum data necessary to achieve the objective of the application. We hold it is vital that if you are to build the necessary trust in the application the level of data being collected is justified publicly by the public health teams demonstrating why this is truly necessary rather than simply the easiest way, or a “nice to have”, given the dangers involved and invasive nature of the technology.

We welcome the NHSX commitment to transparency, and in particular Matthew Gould’s commitment made to the Science & Technology committee on 28 April that the data protection impact assessment (DPIA) for the contact tracing application will be published. We are calling on NHSX to publish the DPIA immediately, rather than just before deployment, to enable (a) public debate about its implications and (b) public scrutiny of the security and privacy safeguards put in place.

We are also asking NHSX to, at a minimum, publicly commit that there will not be a database or databases, regardless of what controls are put in place, that would allow de-anonymization of users of its system, other than those self reporting as infected, to enable the data to be used for building, for example, social graphs.

Finally, we are asking NHSX how it plans to phase out the application after the pandemic has passed to prevent mission creep.

## Footnotes:

1. It is worth noting that this sentence does not, in fact, echo the international letter. Where the UK letter asks to justify such invasions, the international letter outright rejects them “without further discussion”. I think the international letter is better on this point.

2. I should note that I was involved in coordinating and drafting that letter and that I signed the “letter signed by 300 international leading researchers”.

# The Approximate GCD Problem

Steven Galbraith once told me that he expects mathematicians to teach RSA long after the world has migrated to post-quantum algorithms, because it is so easy to explain. Arguably, LWE is easier to explain than RSA, but the Approximate Greatest Common Divisor problem (AGCD) is easier still and requires only scalars. Thus, it is a nice post-quantum alternative for an undergraduate mathematics module. Someone should perhaps write an undergraduate mathematics textbook introducing cryptography using Approximate Common Divisors.
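To make “requires only scalars” concrete, here is a minimal sketch of AGCD instance generation (my illustration; the parameter sizes are toy values, far too small to be secure). An AGCD sample has the form $x_i = p \cdot q_i + r_i$ for a secret $p$, large random $q_i$ and small noise $r_i$; without the noise, the exact GCD recovers $p$ immediately.

```python
import math
import secrets

def agcd_samples(p, m, q_bits, r_bits):
    """Return m approximate multiples x_i = p*q_i + r_i of the secret p."""
    return [p * secrets.randbits(q_bits) + secrets.randbits(r_bits)
            for _ in range(m)]

# Toy parameters -- for illustration only.
p = secrets.randbits(64) | 1                       # secret common divisor
exact = [p * secrets.randbits(96) for _ in range(4)]
noisy = agcd_samples(p, 4, 96, 8)

# Without noise the problem is trivial: the exact GCD of the samples is a
# multiple of p, so it is divisible by p.
assert math.gcd(*exact) % p == 0

# With even tiny noise, gcd no longer reveals p; recovering p from `noisy`
# is (for suitable parameters) the hard AGCD problem.
assert all(x % p < 2**8 for x in noisy)            # noise really is small
```

The entire setup is integer arithmetic on scalars – no polynomials, matrices or elliptic curves – which is what makes it attractive for a first course.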

# Lecturer/Assistant Professor in Cryptography in the ISG

Unfortunately, recruitment for this post was stopped due to the uncertain financial position that UK universities are in at the moment.

The ISG is recruiting a lecturer (≡ assistant professor in the US system, ≡ Juniorprofessor in the German system, ≡ Maître de conférences in the French system; that’s all the systems I know). This is a full-time, permanent research and teaching position.

Look, I know this is England post-Brexit but let me give you a personal pitch of why you should apply:

• It’s a big group. We got ~20 permanent members of staff working across the field of information security: cryptography, systems and social. Check out our seminar programme and our publications to get a sense of what is going on in the group.
• It’s a group with lots of cryptography going on. As mentioned in the ad below, eight permanent members of staff, five postdocs and about 15 PhD students focus on or contribute to cryptographic research. As a corollary, we also have plenty of cryptographers coming through for visits and talks. We got a weekly cryptography reading group, our students have another one and our seminar regularly has cryptography talks.
• It’s a group with a good mix of areas and lots of interaction. UK universities don’t work like German ones where professors have their little empires which don’t interact all too much. Rather, the hierarchies are pretty flat within a department (everybody is line managed by the Head of Department), which facilitates more interaction; at least within the ISG that’s true. For example, I’m currently working on a project with someone from the systems and software security lab and one of our social scientists. I doubt this sort of collaboration would have come about if we didn’t attend the same meetings, teach the same modules, go to lunch and the pub together etc. Interdisciplinarity from above is annoying; when it emerges spontaneously it can be great.
• It’s a nice group. People are genuinely friendly and we help each other out. It will be easy to find someone to proofread your grant applications or share previously successfully funded ones etc. I don’t know any official numbers but the unionisation level seems to be relatively high, which I also take as an indication that people don’t adopt an “everyone for themselves” approach.
• We got funding for our Centre for Doctoral Training for the next four years (then we have to reapply). This means 10 PhD positions per year. Also, our CDT attracts strong students. My research career really took off after getting a chance to work with our amazing students.
• The ISG is its own department (in a school with Physics, EE, Mathematics and Computer Science). All of our teaching is on information security with a focus on our Information Security MSc (which is huge). So you’ll get to teach information security. It is unlikely, though, that you will get to teach cryptography specifically.
• The ISG has strong industry links. Thus, if that’s your cup of tea, it will be easy to get introductions etc. A side effect of these strong links is that consulting opportunities tend to pop up. Consulting is not only permitted by the employer but encouraged (they take a cut if you do it through them).
• The ISG is a large group but Royal Holloway is a relatively small university. That means getting things done by speaking to the person in charge is often possible, i.e. it’s not some massive bureaucracy and exceptions can be negotiated.
• It’s within one standard deviation from London. This means UCL and Surrey, and thus the cryptographers there, aren’t too far away. London Crypto Day is a thing and so are the London-ish Lattice Coding & Crypto Meetings. Also, you get to live in London (or near Egham if that’s your thing, no judgement).

I’m happy to answer informal inquiries etc. We’d appreciate any help in spreading the word.