# What does “secure” mean in Information Security?

This text – written by Rikke Jensen and me – first appeared in the ISG Newsletter 2019/2020 under the title “What is Information Security?”. I’ve added a few links to this version.

The most fundamental task in information security is to establish what we mean by (information) security.

A possible answer to this question is given in countless LinkedIn posts, thought-leader blog entries and industry white papers: Confidentiality, Integrity, Availability. Since the vacuity of the “CIA Triad” is covered in the first lecture of the Security Management module of our MSc, we will assume our readers are familiar with it and will avoid this non-starter. Let us consider the matter more closely.

One subfield of information security that takes great care in tending to its definitions is cryptography. For example, Katz and Lindell write: “A key intellectual contribution of modern cryptography has been the recognition that formal definitions of security are an essential first step in the design of any cryptographic primitive or protocol”. Indeed, finding the correct security definition for a cryptographic primitive or protocol is a critical part of cryptographic work. That these definitions can be non-intuitive yet correct is made acutely apparent when asking students in class to come up with ideas of what it could mean for a block cipher to be secure. They never arrive at PRP security but propose security notions that are, well, broken.

Fine, we can grant cryptography that it knows how to define what a secure block cipher is. That is, we can know what is meant by it being secure, but does that imply that we are? Cryptographic security notions – and everything that depends on them – do not exist in a vacuum, they have reasons to be. While the immediate objects of cryptography are not social relations, it presumes and models them. This fact is readily acknowledged in the introductions of cryptographic papers where authors illustrate the utility of their proposed constructions by reference to some social situation where several parties have conflicting ends but a need or desire to interact. Yet, this part of the definitional work has not received the same rigour from the cryptographic community as complexity-theoretic and mathematical questions. For example, Goldreich writes: “The foundations of cryptography are the paradigms, approaches, and techniques used to conceptualize, define, and provide solutions to natural ‘security concerns’ ”. Following Blanchette we may ask back: “How does one identify such ‘natural security concerns’? On these questions, the literature remains silent”.

# Cryptographic Security Proofs as Dynamic Malware Analysis

This text first appeared in the ISG Newsletter 2019/2020. I’ve added a bunch of links to this version.

# Open Letters v Surveillance

A letter, signed by 177 scientists and researchers working in the UK in the fields of information security and privacy, reads:

“Echoing the letter signed by 300 international leading researchers, we note that it is vital that, when we come out of the current crisis, we have not created a tool that enables data collection on the population, or on targeted sections of society, for surveillance. Thus, solutions which allow reconstructing invasive information about individuals must be fully justified.1 Such invasive information can include the ‘social graph’ of who someone has physically met over a period of time. With access to the social graph, a bad actor (state, private sector, or hacker) could spy on citizens’ real-world activities.”

Our letter2 stands in a tradition of similar letters and resolutions. For example, here is the “Copenhagen Resolution” of the International Association for Cryptologic Research (IACR), i.e. the professional body of cryptographers, from May 2014:

“The membership of the IACR repudiates mass surveillance and the undermining of cryptographic solutions and standards. Population-wide surveillance threatens democracy and human dignity. We call for expediting research and deployment of effective techniques to protect personal privacy against governmental and corporate overreach.”

Both of these documents, in line with similar documents, treat privacy and surveillance rather abstractly. On the one hand, this has merit: many of us, including myself, are experts on the technology and can speak to that with authority. Also, getting people working on privacy to agree that privacy is important is straightforward; getting them to agree on why is a much more difficult proposition. On the other hand, when we are making political interventions such as passing resolutions or writing open letters, I think we should also ask ourselves why privacy matters. I suspect we won’t all agree, but the added clarity might still be helpful.

Similar considerations have been voiced in several works before, e.g.

“The counter-surveillance movement is timely and deserves widespread support. However, as this article will argue and illustrate, raising the specter of an Orwellian system of mass surveillance, shifting the discussion to the technical domain, and couching that shift in economic terms undermine a political reading that would attend to the racial, gendered, classed, and colonial aspects of the surveillance programs. Our question is as follows: how can this specific discursive framing of counter-surveillance be re-politicized and broadened to enable a wider societal debate informed by the experiences of those subjected to targeted surveillance and associated state violence?” – Seda Gürses, Arun Kundnani & Joris Van Hoboken. Crypto and empire: the contradictions of counter-surveillance advocacy in Media, Culture & Society, 38(4), 576–590. 2016

and

“History teaches that extensive governmental surveillance becomes political in character. As civil-rights attorney Frank Donner and the Church Commission reports thoroughly document, domestic surveillance under U.S. FBI director J. Edgar Hoover served as a mechanism to protect the status quo and neutralize change movements. Very little of the FBI’s surveillance-related efforts were directed at law-enforcement: as the activities surveilled were rarely illegal, unwelcome behavior would result in sabotage, threats, blackmail, and inappropriate prosecutions, instead. For example, leveraging audio surveillance tapes, the FBI attempted to get Dr. Martin Luther King, Jr., to kill himself. U.S. universities were thoroughly infiltrated with informants: selected students, faculty, staff, and administrators would report to an extensive network of FBI handlers on anything political going on on campus. The surveillance of dissent became an institutional pillar for maintaining political order. The U.S. COINTELPRO program would run for more than 15 years, permanently reshaping the U.S. political landscape.” – Phillip Rogaway, The moral character of cryptographic work in Cryptology ePrint Archive. 2016

The pertinent question then is what surveillance is for. For an answer we have to look no further than GCHQ’s page about the Investigatory Powers Act 2016, which “predominantly governs” its mission:

“Before an interception warrant can be issued, the Secretary of State must believe that a warrant is necessary on certain, limited grounds, and that the interception is proportionate to what it seeks to achieve.

These grounds are that interception is necessary:

• In the interests of national security; or
• In the interests of the economic well-being of the UK; or
• In support of the prevention or detection of serious crime” – GCHQ. Investigatory Powers Act. 2019

To unpack what this means in detail, we can recall how and when the means of surveillance have been deployed in recent history, by GCHQ and other departments of the British State. A programme that might cover all three aspects was GCHQ’s Tempora programme which tapped into fibre-optic cables for secret access to the world’s communications. Similarly, hacking the SIM manufacturer Gemalto would also cover all three. Spying on the German government is probably one for the economic well-being of the UK. Similarly, the London Metropolitan Police working with construction firms to blacklist trade unionists would fall into that category. UK intelligence services spying on Privacy International, police officers infiltrating Greenpeace and other activist groups, and databases of activists are plausibly done in the name of national security. The stop and search policy tackles, amongst other things, the serious crime of walking while black.

When we write that “solutions which allow reconstructing invasive information about individuals must be fully justified”, it is worth paying close attention to the justifications offered and asking: justified to whom and by what standard?

## Appendix: The letter in full

We, the undersigned, are scientists and researchers working in the UK in the fields of information security and privacy. We are concerned about plans by NHSX to deploy a contact tracing application. We urge that the health benefits of a digital solution be analysed in depth by specialists from all relevant academic disciplines, and sufficiently proven to be of value to justify the dangers involved.

A contact tracing application is a mobile phone application which records, using Bluetooth, the contacts between individuals, in order to detect a possible risk of infection. Such applications, by design, come with risks for privacy and medical confidentiality which can be mitigated more or less well, but not completely, depending on the approach taken in their design. We believe that any such application will only be used in the necessary numbers if it gives reason to be trusted by those being asked to install it.

It has been reported that NHSX is discussing an approach which records centrally the de-anonymised ID of someone who is infected and also the IDs of all those with whom the infected person has been in contact. This facility would enable (via mission creep) a form of surveillance. Echoing the letter signed by 300 international leading researchers, we note that it is vital that, when we come out of the current crisis, we have not created a tool that enables data collection on the population, or on targeted sections of society, for surveillance. Thus, solutions which allow reconstructing invasive information about individuals must be fully justified. Such invasive information can include the “social graph” of who someone has physically met over a period of time. With access to the social graph, a bad actor (state, private sector, or hacker) could spy on citizens’ real-world activities. We are particularly unnerved by a declaration that such a social graph is indeed aimed for by NHSX.

We understand that the current proposed design is intended to meet the requirements set out by the public health teams, but we have seen conflicting advice from different groups about how much data the public health teams need. We hold that the usual data protection principles should apply: collect the minimum data necessary to achieve the objective of the application. We hold it is vital that if you are to build the necessary trust in the application the level of data being collected is justified publicly by the public health teams demonstrating why this is truly necessary rather than simply the easiest way, or a “nice to have”, given the dangers involved and invasive nature of the technology.

We welcome the NHSX commitment to transparency, and in particular Matthew Gould’s commitment made to the Science & Technology committee on 28 April that the data protection impact assessment (DPIA) for the contact tracing application will be published. We are calling on NHSX to publish the DPIA immediately, rather than just before deployment, to enable (a) public debate about its implications and (b) public scrutiny of the security and privacy safeguards put in place.

We are also asking NHSX to, at a minimum, publicly commit that there will not be a database or databases, regardless of what controls are put in place, that would allow de-anonymization of users of its system, other than those self reporting as infected, to enable the data to be used for building, for example, social graphs.

Finally, we are asking NHSX how it plans to phase out the application after the pandemic has passed to prevent mission creep.

## Footnotes:

1. It is worth noting that this sentence does not, in fact, echo the international letter. Where the UK letter asks to justify such invasions, the international letter outright rejects them “without further discussion”. I think the international letter is better on this point.

2. I should note that I was involved in coordinating and drafting that letter and that I signed the “letter signed by 300 international leading researchers”.

# The Approximate GCD Problem

Steven Galbraith once told me that he expects mathematicians to teach RSA long after the world has migrated to post-quantum algorithms; because it is so easy to explain. Arguably, LWE is easier to explain than RSA but the Approximate Greatest Common Divisors problem (AGCD) is even easier than that and requires only scalars. Thus, it is a nice post-quantum alternative for an undergraduate mathematics module. Someone should perhaps write an undergraduate mathematics textbook introducing cryptography using Approximate Common Divisors.
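To make this concrete, here is a toy sketch of textbook AGCD-based secret-key encryption of a single bit, in the style of the DGHV scheme: ciphertexts are AGCD samples $q \cdot p + r$ with the message bit hidden in the parity of the noise. The parameter sizes below are made up for illustration and offer no security whatsoever.

```python
import secrets

# Illustrative toy parameters -- far too small for any real security.
P_BITS, Q_BITS, R_BITS = 128, 64, 16

def keygen():
    # the secret key is a large odd integer p
    return (1 << (P_BITS - 1)) | secrets.randbits(P_BITS - 1) | 1

def encrypt(p, m):
    # an AGCD sample q*p + r, with the bit m hidden in the parity of the noise
    q = secrets.randbits(Q_BITS)
    r = secrets.randbits(R_BITS)
    return q * p + 2 * r + m

def decrypt(p, c):
    # reduce mod p to strip off q*p (valid since 2*r + m < p),
    # then the parity of what remains is m
    return (c % p) % 2

p = keygen()
assert decrypt(p, encrypt(p, 0)) == 0
assert decrypt(p, encrypt(p, 1)) == 1
```

The whole scheme fits in a dozen lines of scalar arithmetic, which is the point: no polynomials, no linear algebra, just integers.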

# Lecturer/Assistant Professor in Cryptography in the ISG

Unfortunately, recruitment for this post was stopped due to the uncertain financial position that UK universities are in at the moment.

The ISG is recruiting a lecturer (≡ assistant professor in the US system, ≡ Juniorprofessor in the German system, ≡ Maître de conférences in the French system; that’s all the systems I know). This is a full-time, permanent research and teaching position.

Look, I know this is England post-Brexit but let me give you a personal pitch of why you should apply:

• It’s a big group. We got ~20 permanent members of staff working across the field of information security: cryptography, systems and social. Check out our seminar programme and our publications to get a sense of what is going on in the group.
• It’s a group with lots of cryptography going on. As mentioned in the ad below, eight permanent members of staff, five postdocs and about 15 PhD students focus on or contribute to cryptographic research. As a corollary, we also have plenty of cryptographers coming through for visits and talks. We got a weekly cryptography reading group, our students have another one and our seminar regularly has cryptography talks.
• It’s a group with a good mix of areas and lots of interaction. UK universities don’t work like German ones where professors have their little empires which don’t interact all too much. Rather, the hierarchies are pretty flat within a department (everybody is line managed by the Head of Department), which facilitates more interaction; at least within the ISG that’s true. For example, I’m currently working on a project with someone from the systems and software security lab and one of our social scientists. I doubt this sort of collaboration would have come about if we didn’t attend the same meetings, teach the same modules, go to lunch and the pub together etc. Interdisciplinarity from above is annoying; when it emerges spontaneously it can be great.
• It’s a nice group. People are genuinely friendly and we help each other out. It will be easy to find someone to proofread your grant applications or share previously funded ones etc. I don’t know any official numbers but the unionisation level seems to be relatively high, which I also take as an indication that people don’t adopt an “everyone for themselves” approach.
• We got funding for our Centre for Doctoral Training for the next four years (then we have to reapply). This means 10 PhD positions per year. Also, our CDT attracts strong students. My research career really took off after getting a chance to work with our amazing students.
• The ISG is its own department (in a school with Physics, EE, Mathematics and Computer Science). All of our teaching is on information security with a focus on our Information Security MSc (which is huge). So you’ll get to teach information security. It is unlikely, though, that you will get to teach cryptography specifically.
• The ISG has strong industry links. Thus, if that’s your cup of tea, it will be easy to get introductions etc. A side effect of these strong links is that consulting opportunities tend to pop up. Consulting is not only permitted by the employer but encouraged (they take a cut if you do it through them).
• The ISG is a large group but Royal Holloway is a relatively small university. That means getting things done by speaking to the person in charge is often possible, i.e. it’s not some massive bureaucracy and exceptions can be negotiated.
• It’s within one standard deviation from London. This means UCL and Surrey, and thus the cryptographers there, aren’t too far away. London Crypto Day is a thing and so are the London-ish Lattice Coding & Crypto Meetings. Also, you get to live in London (or near Egham if that’s your thing, no judgement).

I’m happy to answer informal inquiries etc. We’d appreciate any help in spreading the word.

# 10 PhD Positions at Royal Holloway’s Centre for Doctoral Training in Cyber Security for the Everyday

At Royal Holloway we are again taking applications for ten fully-funded PhD positions in Information Security. See the CDT website and the ISG website for what kind of research we do. Also, check out our past and current CDT students and our research seminar schedule to get an idea of how broad and diverse the areas of information security are in which the ISG works.

More narrowly, to give you some idea of cryptographic research (and thus supervision capacity) in the ISG/at Royal Holloway: currently, there are nine permanent members of staff working on cryptography: Simon Blackburn (Maths), Carlos Cid, Keith Martin, Sean Murphy, Siaw-Lynn Ng, Rachel Player, Liz Quaglia and me. In addition, there are five postdocs working on cryptography and roughly 15 PhD students. Focus areas of cryptographic research currently are: lattice-based cryptography and applications, post-quantum cryptography, symmetric cryptography, statistics, access control, information-theoretic security and protocols.

Note that most of these positions are reserved for UK residents, which does, however, not mean nationality (see CDT website for details) and there might also be some wiggle room for EU residents (yes, still!).

# 17th IMA Conference on Cryptography and Coding

IMA-CC is a crypto and coding theory conference held biennially in the UK. It was previously held in Cirencester, so you might have heard of it as the “Cirencester” conference. However, it has moved to Oxford, so calling it Cirencester now is a bit confusing. Anyway, it is happening again this year. IMA-CC is a small but fine conference with the added perk of being right before Christmas. This is great because around that time of the year Oxford is a fairly Christmas-y place to be.

16 – 18 December 2019, St Anne’s College, University of Oxford

# Postdoc Position at Royal Holloway on Key Exchange

Carlos and I have a postdoc position on designing cryptographic key exchange protocols that support incorporating key material from, erm, … diverse sources. This is part of a consortium that looks at integrating some quantum cryptography with post-quantum cryptography, but there is no need to think so narrowly about the problem. That is, the project is about incorporating randomness from wherever it might come and what security goals can be achieved depending on what is compromised. More generally, if you enjoy cryptographic protocols, not limited to key exchange protocols, this might be a fitting postdoc position. Get in touch with Carlos or me, if you’re unsure on whether the position is a good fit.

• Location: Egham
• Salary: £39,479 to £41,743 per annum – including London Allowance
• Closing Date: Tuesday 12 March 2019
• Interview Date: To be confirmed
• Reference: 0219-048

The Information Security Group at Royal Holloway University of London is seeking to recruit a postdoctoral research assistant (PDRA) to work in the area of cryptography. The position is available for immediate start, for up to 26 months (until 31 March 2021).

The PDRA will work alongside Prof. Carlos Cid, Dr. Martin Albrecht and other cryptographic researchers at Royal Holloway on topics connected to the design and analysis of cryptographic key exchange protocols that support incorporating key material from diverse sources. This post is part of the AQuaSec project, an Innovate UK-funded research project with 17 partners from industry and academia, aiming to develop technologies for quantum-safe communications by integrating post-quantum cryptography with techniques from quantum cryptography.

Applicants for this role should have already completed, or be close to completing, a PhD in a relevant discipline, with an outstanding research track record in cryptography. Applicants should be able to demonstrate scientific creativity, research independence, and the ability to communicate their ideas effectively in written and verbal form. Salary is £39,479 per annum, inclusive of London Allowance. This post is appointed at Grade 7, Spine point 34.

Established in 1990, the Information Security Group at Royal Holloway was one of the first dedicated academic groups in the world to conduct research and teaching in information security. The ISG is today a world-leading interdisciplinary research group with 20 full-time members of staff, several postdoctoral research assistants and over 50 PhD students working on a range of subjects in cyber security, in particular cryptography.

In return we offer a highly competitive rewards and benefits package including:

• Generous annual leave entitlement
• Training and Development opportunities
• Pension Scheme with generous employer contribution
• Various schemes including Cycle to Work, Season Ticket Loans and help with the cost of Eyesight testing.
• Free parking
• Competitive Maternity, Adoption and Shared Parental Leave provisions

The post is based in Egham, Surrey where the College is situated in a beautiful, leafy campus near to Windsor Great Park and within commuting distance from London.

To view further details of this post and to apply please visit https://jobs.royalholloway.ac.uk. For queries on the application process the Human Resources Department can be contacted by email at: recruitment@rhul.ac.uk. Informal enquiries can be made to Prof. Carlos Cid at carlos.cid@rhul.ac.uk.

Closing Date: Midnight, 12 March 2019

Interview Date: To be confirmed

PS: I will have two more postdoc positions, on lattice-based cryptography in the next few weeks/months.

# 10 PhD Positions at Royal Holloway’s Centre for Doctoral Training in Cyber Security

At Royal Holloway we are now taking applications for ten fully-funded PhD positions in Information Security. See the CDT website and the ISG website for what kind of research we do. In particular, check out our past and current CDT students to get an idea of how broad and diverse the areas of information security are in which they work.

Note that most of these positions are reserved for UK residents, which does, however, not mean nationality (see CDT website for details) and there might also be some wiggle room for EU residents.

# NTT Considered Harmful?

In a typical Ring-LWE-based public-key encryption scheme, Alice publishes

$(a, b) = (a, a \cdot s + e) \in \mathbb{Z}_q[x]/(x^n+1)$

(with $n$ a power of two1) as the public key, where $s, e$ are both “small” and secret. To encrypt, Bob computes

$(c_{0}, c_{1}) = (v \cdot a + e', v \cdot b + e'' + \textnormal{Encode}(m))$

where $v, e', e''$ are small, $m \in \{0,1\}^n$ is the message and $\textnormal{Encode}(\cdot)$ is some encoding function, e.g. $\sum_{i=0}^{n-1} \lfloor \frac{q}{2} \rfloor m_i x^i$. To decrypt, Alice computes

$c_{1} - c_{0} \cdot s = v \cdot (a \cdot s + e) + e'' + \textnormal{Encode}(m) - (v \cdot a + e') \cdot s,$

which is equal to $v \cdot e - e' \cdot s + e'' + \textnormal{Encode}(m)$. Finally, Alice recovers $m$ from the noisy encoding of $m$ where $v \cdot e - e' \cdot s + e''$ is the noise. In the Module-LWE variant the elements essentially live in $\left(\mathbb{Z}_q[x]/(x^n+1)\right)^k$, e.g. $a$ is not a polynomial but a vector of polynomials.
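As a sanity check, the scheme above can be sketched in a few lines of Python using schoolbook negacyclic multiplication. The parameters ($n = 16$, $q = 7681$, ternary noise) are toy values for illustration only and offer no security; the sketch decrypts via $c_1 - c_0 \cdot s$ so that the encoded message appears with a positive sign.

```python
import numpy as np

n, q = 16, 7681                 # toy parameters (q ≡ 1 mod 2n); far too small for security
rng = np.random.default_rng(0)

def mul(f, g):
    """Schoolbook multiplication in Z_q[x]/(x^n + 1)."""
    h = np.convolve(f, g)       # product of degree <= 2n - 2
    lo, hi = h[:n], h[n:]
    lo[:hi.size] -= hi          # reduce using x^n ≡ -1
    return lo % q

small = lambda: rng.integers(-1, 2, n)   # "small" coefficients in {-1, 0, 1}

# KeyGen: Alice publishes (a, b) = (a, a*s + e)
s, e = small(), small()
a = rng.integers(0, q, n)
b = (mul(a, s) + e) % q

# Encrypt: Bob encodes each message bit m_i as floor(q/2) * m_i
m = rng.integers(0, 2, n)
v, e1, e2 = small(), small(), small()
c0 = (mul(a, v) + e1) % q
c1 = (mul(b, v) + e2 + (q // 2) * m) % q

# Decrypt: c1 - c0*s = v*e - e1*s + e2 + Encode(m); round away the noise
d = (c1 - mul(c0, s)) % q
m_rec = np.rint(2 * d / q).astype(int) % 2
assert (m_rec == m).all()
```

With ternary noise each coefficient of the noise term is bounded by $2n + 1 = 33$ here, well below $q/4$, so rounding always recovers $m$.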

Thus, both encryption and decryption involve polynomial multiplication modulo $x^n+1$. Using schoolbook multiplication this costs $\mathcal{O}(n^2)$ operations. However, when selecting parameters for Ring-LWE, we can choose $q \equiv 1 \bmod 2n$, which permits us to use an NTT to realise this multiplication (we require $q \equiv 1 \bmod 2n$, rather than merely $\bmod\ n$, to use the negacyclic NTT, which has modular reduction modulo $x^n+1$ baked in). Then, using the NTT we can implement multiplication by

1. evaluation (perform NTT),
2. pointwise multiplication,
3. interpolation (perform inverse NTT).

Steps (1) and (3) take $\mathcal{O}(n \log n)$ operations by using specially chosen evaluation points (roots of unity). Step (2) costs $\mathcal{O}(n)$ operations.
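The three steps above can be illustrated with a toy negacyclic NTT over $\mathbb{Z}_{17}$ with $n = 8$: the odd powers of a primitive $2n$-th root of unity $\psi$ are exactly the roots of $x^n + 1$, so evaluating there makes reduction modulo $x^n + 1$ automatic. For clarity both transforms are written as direct $\mathcal{O}(n^2)$ sums; a butterfly implementation would bring them down to $\mathcal{O}(n \log n)$. Parameters are hand-picked for illustration only.

```python
# q = 17, n = 8: psi = 3 is a primitive 2n-th root of unity mod q,
# so the odd powers psi^(2j+1) are precisely the roots of x^n + 1 in Z_q.
q, n, psi = 17, 8, 3
pts = [pow(psi, 2 * j + 1, q) for j in range(n)]

def ntt(f):
    # step (1): evaluation at the roots of x^n + 1
    return [sum(c * pow(p, i, q) for i, c in enumerate(f)) % q for p in pts]

def intt(F):
    # step (3): interpolation -- the scaled inverse transform
    n_inv = pow(n, -1, q)
    return [n_inv * sum(Fj * pow(pts[j], -i, q) for j, Fj in enumerate(F)) % q
            for i in range(n)]

def schoolbook(f, g):
    # reference multiplication in Z_q[x]/(x^n + 1), with x^n ≡ -1
    h = [0] * n
    for i, fi in enumerate(f):
        for j, gj in enumerate(g):
            h[(i + j) % n] += fi * gj * (-1 if i + j >= n else 1)
    return [c % q for c in h]

f, g = [1, 2, 3, 4, 5, 6, 7, 8], [8, 7, 6, 5, 4, 3, 2, 1]
F, G = ntt(f), ntt(g)
# step (2): pointwise multiplication in the NTT domain
assert intt([a * b % q for a, b in zip(F, G)]) == schoolbook(f, g)
```

Since the evaluation points are roots of $x^n + 1$, the pointwise product in the NTT domain interpolates back to exactly the negacyclic product, with no separate reduction step.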

This trick is very popular. For example, many (but not all!) Ring-LWE based schemes submitted to the NIST PQC competition process use it, namely NewHope, LIMA (go LIMA!), LAC, KCL, HILA5, R.EMBLEM, Ding Key-Exchange, CRYSTALS-KYBER, CRYSTALS-DILITHIUM (sorry, if I forgot one). Note that since steps (1) and (3) are the expensive steps, it makes sense to remain in the NTT domain (i.e. after applying the NTT) and only to convert back at the very end. For example, it is faster for Alice to store $s, e$ in NTT domain and, since the NTT maps uniform to uniform, to sample $a$ in NTT domain directly, i.e. to just assume that a random vector $a$ is already the output of an NTT on some other random vector.

This post is about two recent results I was involved in, which suggest that this is not necessarily always the best choice (depending on your priorities).

Warning: This is going to be one of those clickbait-y pieces where the article doesn’t live up to the promise in the headline. The NTT is fine. Some of my best friends use the NTT. In fact I’ve implemented and used the NTT myself.