'Concealing for Freedom: The Making of Encryption, Secure Messaging and Digital Liberties' Book Launch
12:22PM Jun 7, +0000
Speakers:
Andrea Medrado
Jo Pierson
Francesca Musiani
Ksenia Ermoshina
Sandra Braman
Keywords:
book
encryption
francesca
users
secure messaging
telegram
based
privacy
project
architecture
tools
apps
communication
developers
question
protocol
mentioned
messaging apps
technology
data
Okay, we have 13 people in the room now; we'll wait a couple more minutes. Hi Martine! Yeah, hopefully... okay, you joined us, good, good. Nice to see you. You too. I'll just wait one more minute and then I can start doing the quick introduction.
Okay, well, it's five past two, so I think we can start. Welcome, everyone, the usual Zoom greetings: good morning, good afternoon, good evening. I'm Andrea Medrado, currently vice president of IAMCR and a lecturer at the School of Media and Communication of the University of Westminster, in London. And you are all very welcome to the book launch of Concealing for Freedom: The Making of Encryption, Secure Messaging and Digital Liberties by Francesca Musiani and Ksenia Ermoshina, part of the IAMCR webinar series. These are events in partnership with IAMCR sections and working groups to foster engagement on occasions outside of the annual IAMCR conferences. This event is being sponsored by the Communication Policy and Technology (CPT) section of IAMCR, so I'm honored to be here representing the association during the launch of this important title, the first in-depth sociological inquiry into the field of secure communication. I wanted to thank, of course, Francesca and Ksenia, the authors of the book; Francesca is also co-chair of the CPT section and a former co-chair of the Emerging Scholars Network section. Thank you also to Bruce Girard and his team for all the work organizing this event and making it happen. As usual for our webinars, recording is in progress and all microphones are muted during the presentations, but participants are of course welcome to interact via the chat box, and there will be a Q&A moment after the presentation. Anyone can interact either by typing in the chat or by unmuting themselves and speaking. Just so you know, we are recording this webinar and it is going to be available on the IAMCR webinars web page afterwards. We are also live streaming it via isoc.live, and the link is in the chat, if I'm not mistaken. So I will now introduce Jo Pierson, who will act as our moderator and lead our discussions today.
He is a professor of responsible digitalization at the Vrije Universiteit Brussel and at the School of Social Sciences, sorry, at Hasselt University in Belgium. He is also a former chair of the CPT section. So Jo, the screen is all yours. Thank you.
Thank you very much for the nice introduction. So, as announced, I was asked to act as moderator but also as discussant for this very interesting webinar on the launch of the book Concealing for Freedom: The Making of Encryption, Secure Messaging, and Digital Liberties.
So besides moderating the event, I will also, at the end, give some comments and feedback, and possibly already some questions on the book, based also on the presentation that our two presenters will give. So this is the great occasion to introduce the two speakers, the two authors of the book: first of all Ksenia Ermoshina, and the other author being Francesca Musiani. They are both tenured researchers at the French National Center for Scientific Research (CNRS), and they are based at the Center for Internet and Society, which Francesca co-founded and co-directs. From 2016 to 2018, both of them worked on the H2020 project NEXTLEAP, which stands for NEXt generation Techno-social and Legal Encryption, Access and Privacy. As you will hear in the presentation, and as is also mentioned in the book, this project has been very important in establishing the research that resulted in the book being presented today. The research of both authors explores Internet infrastructures and architectures as tools for governance and resistance in today's digital world. Let's for now give the floor to Ksenia and Francesca to start the presentation.
Over to you. Thank you.
Thank you very much. Thank you to IAMCR, to Andrea for spearheading this, to Jo for accepting the moderation and discussion, and to all of you who took the time out to be here today. I also really wish to thank Joly from the Internet Society, who picked up on this webinar; ISOC, both at the global level and in some of its national chapters, has a strong interest in encryption, so we are very grateful that this is being simulcast and live streamed. In the meantime, I hope I haven't forgotten anyone... well, I forgot to explicitly mention Bruce, who supervises this and many other IAMCR activities with the same attention and patience. So thanks a lot, Bruce. And so, it is time to say a few things about this book, and I am going to share a few slides to do so. This is the cover of the book, which has to do with chameleons, but also with ASCII code and with messaging, more or less secure. The book, as has been mentioned, is called Concealing for Freedom, and the key words around it are, on one hand, encryption; on the other hand, secure messaging; and third, digital liberties, with the idea of exploring the crossroads between these three phenomena. We will be saying many more things about this in the coming minutes with Ksenia, who is also here and will be speaking at two points in the presentation, as will I. The book has been published by Mattering Press, to which I would like to give a bit of a heads up and some publicity: it is an open access, researcher-led publisher based in the UK. Incidentally, this has meant that we haven't yet received physical copies of the book, which are stuck in some customs limbo between the UK and France, because Brexit has made no difference whatsoever in our lives, right?
But yes, we are waiting patiently, and the good thing in the meantime is that the book is open access. So if you feel like buying a paper copy, and accept the risk that at some point it will show up in your mailbox, when it gets through customs, please do, because it will help Mattering Press co-fund other open access projects; but it is already readily accessible and available on the Internet, on the Mattering Press website. So, as I started to mention, this book comes out of a very specific context of research project and research funding, which is meaningful in many respects for the shape that the book ended up taking and for how we have presented our research results. This research has been conducted in the frame of the European H2020 project NEXTLEAP, NEXt generation Techno-social and Legal Encryption, Access and Privacy, which ran from 2016 to 2018, so we had three years of funding. This project is also about a very particular historical context, which I will say a little bit more about in the next slide. For now, I will just say that something very important had happened for the privacy consciousness, the self-consciousness, of users about privacy and
their knowledge of a number of techniques that are available, or that should be available, to users to protect their communications. This phenomenon was the revelations by Edward Snowden about the system of mass surveillance spearheaded by the National Security Agency, and all the different alliances and liaisons it revealed between governmental agencies and private-sector actors, in order to exert surveillance on specific populations or on the population at large. And so this project, and I'm sure quite a few others in the period that immediately followed the Snowden revelations, was funded with the belief that interdisciplinary research about privacy could serve to build things that could do something about what the Snowden revelations had brought to the fore, and that good research merging the social sciences and the "hard" sciences, computer science first and foremost, could really help citizens, on a daily basis and in a very concrete way, to improve their secure communications. And so the idea for NEXTLEAP came to our minds, and it was subsequently funded with the idea of developing an interdisciplinary science of decentralization merged with privacy, which could provide a basis on which to build, or help build, new protocols for secure
communications. And here is a part
of the team; Ksenia is not there yet, because she was hired a little bit afterwards on the project, as a postdoc. So really, this project owes a lot to the particular historical context that came with the Snowden revelations. Before we go into the core content of the book, I would like to mention that we also had the pleasure and the honor of having Professor Laura DeNardis from American University, with whom I have published a couple of volumes already; we had the pleasure of having her write a foreword for our book, and she decided to call it "The Political Life of Encryption." In this foreword she goes beyond what is the more specific focus of our book, the field of secure communication, to speak about the wide variety of contexts in which encryption is "the most politically charged of all Internet technologies," as she describes it. She emphasizes that encryption is really a site of contestation and tension between sets of competing values, and as such has seen its life, or rather its multiplicity of lives, crossed by multiple controversies that have technical components, but also economic, social, and political ones. So, this book, as I explained,
Ksenia, you know that we share the slides, so I didn't remember if it was this one or the following that was yours.
Yes, we were just going to say that encryption is a political thing, not just a technical one, and it sits at the crossroads of many sub-disciplines: not only cryptography but also usability, which has become important for us as STS scholars. I will try to explain why. This book is actually, in itself, a hybrid object, as it results from a collaboration with a team of computer scientists. Within NEXTLEAP we were surrounded by cryptographers who do fundamental research in crypto, who teach in the best schools in the world and who are innovating fundamentally. For example, one of our colleagues, Jordan, is now working on post-quantum cryptography, so he is in the avant-garde of the making of encryption protocols. And we, as social scientists, as STS researchers, wanted to collaborate with them. But this collaboration had an interesting life itself, because in the beginning, when we joined the project (and I think this is important for understanding how this book is written and how we experienced the writing when we were trying to put it into words), they were expecting from us a quite quantitative, sociological-ish kind of work, which today is actually done by usability studies, or "usable security," a sub-discipline that studies how tools and systems are used by people. This discipline proposes a rather quantitative approach: users are put into a table, they have to respond to closed questionnaires, yes or no, and the goal is an almost statistical analysis of what people are afraid of, what they protect, how they do it, and so on and so forth. So half of our team was expecting this kind of work from us, but we decided that we would use these expectations also to understand how cryptographers imagine users, how they envision and embody users through these tables, through these kinds of quantitative descriptions.
And we tried not just to do what they wanted, but also to do our own job and analyze their reflections and our collaboration with them. Another objective of this book is to analyze the arena of encryption and secure messaging. It is a galaxy of projects that are rapidly evolving; even now, new messengers still appear every now and then, and people have many apps. They are diving into this "mess of messengers," a notion that we use in the book; the phrase was coined by Holger Krekel, one of our colleagues from the NEXTLEAP team, to describe the vibrant and diverse scene of messaging apps proposing encryption and privacy by design that followed the Snowden revelations. We also followed the very deep discussions in the cryptographic community about how to proceed now that users, the public, are interested in encryption: it is no longer just a matter for fundamental scientists but a public matter, a political matter. How do we deal with this? So, at its core, this book is social science research, but it is deeply embedded among technologists, and we hope it can also help to improve technologies. One of the outcomes of our work, for example, is very practical, because we collaborated with Delta Chat, one of the secure messaging apps that we studied, and an STS-informed usability research project was conducted to help them improve their design. This is very interesting, because my personal journey, as one of the two authors of this book, went from being an STS postdoc, very, very curious about things, to becoming a user experience researcher in one of these messaging app projects.
If we can move on... sorry. Yeah. So, as I said, this is a very rapidly changing field, and no systematization had been done, back in 2016, of what was happening. We wanted to provide an analytical portrait of this field, but also to use the STS conceptual apparatus to describe these phenomena. We hope this book will be of interest to technologists and to STS scholars as well, but we also hope it can fuel some discussions on practices with and around privacy-protecting communication tools. When we started this research, the fieldwork was actually quite challenging, because no systematization existed, in the human sciences at least. So we did a mapping of all the projects we could find that offered end-to-end encryption, meaning that messages are encrypted not just on the server but on users' phones or computers before they are sent, and the recipient receives an encrypted message and decrypts it on his or her device. This field is still in the making. Some of the logos on the right you may recognize, for example Signal or Telegram; Telegram is a very problematic case, and we even dedicated a special paper to it after the book was written, deciding to dive a bit deeper into the Telegram case and its controversial success in Russia. So in this book we try to talk about various messaging apps, and not just about the apps: we approach each of them via a particular case study. We try to talk about the architectures they are based on, about protocols, and about some of the drama, because the crypto scene is full of drama.
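The end-to-end idea just described, that only the endpoints hold the key while the server relays opaque ciphertext, can be illustrated with a deliberately toy sketch. To be clear, this is NOT a real cipher and nothing like the actual Signal protocol (which uses authenticated key exchange and vetted primitives); it is only a pedagogical illustration of where the key lives:

```python
# Toy illustration of end-to-end encryption. NOT secure; for illustration only.
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    # Derive a pseudorandom keystream from the shared key (toy construction).
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR the plaintext with the keystream.
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # XORing with the same keystream inverts itself

# The key exists only on the two endpoints' devices, never on the server.
alice_and_bob_key = secrets.token_bytes(32)

ciphertext = encrypt(alice_and_bob_key, b"meet at noon")
# The server relays only ciphertext; it never sees the key or the plaintext.
server_sees = ciphertext
assert server_sees != b"meet at noon"
print(decrypt(alice_and_bob_key, server_sees))  # Bob decrypts on his own device
```

The point of the sketch is the trust boundary, not the cryptography: with end-to-end encryption the relay server is reduced to carrying bytes it cannot read.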
And the book actually addresses the gap I was talking about, because technologists from the NEXTLEAP team have their own idea of what users need, and then there are users who have their own idea of what they need and of what encryption can and cannot do. So we wanted to study the interactions of developers and other stakeholders: for example, how developers and users interact, and whether some of these messaging apps are informed by user needs or are based on some abstract theoretical research about encryption. We also talked to security trainers, including those who work in very risky areas. For example, we interviewed some of the trainers working in Ukraine and some of the journalists who went to war, because the war in Ukraine has been going on for eight years, and we have been examining how encryption helps, or sometimes does not help, to save the lives of those who go to these risky areas, how they actually conceptualize risk, and how they use these tools. We also wanted to look at the interactions with standardizing bodies and the weird configurations of standardizing, not standardizing, or partly standardizing protocols; and also, of course, at the funding organizations and the different business models behind these projects. We had a core common objective of creating tools that "conceal for freedom," because NEXTLEAP was not just us: it was also technologists who were building protocols, and some of those protocols are now in use. For example, Autocrypt is one of the results of NEXTLEAP, and there are many more. While differing in their intended technical architectures, the messaging apps that we studied also differ in their targeted user publics and in their underlying values and business models. So we try to describe all these various classifications and provide a summary of what is happening in the field. Sorry, I
had muted myself. So, what we tried to do overall in this book is really to look at what we call the experience of encryption. The idea is, observing that there is such a great variety of secure messaging protocols and tools today, to do some kind of, not comparison in the strict sense, but categorization of what they do in terms of architectural choices, business models and so on, with the idea of seeing what these endeavors mean for the making of digital liberties, in the daily activity of developers and in the daily preferences and choices of users. Just to say a little about what we drew from: there is this idea of bringing a social science perspective to encryption, which we certainly did not start, but we focused on a specific angle that was really underexplored so far. We do draw a lot on a number of scholars, including some who are really active in IAMCR, by the way: from Gabriella Coleman's work on coding freedom, to Stefania Milan's recent work on the practices of data activism, to Arne Hintz, Lina Dencik and others' work on data justice, and also the body of work that observes how encryption is a matter of competing imaginaries that help construct specific understandings of freedom; Sarah Myers West and Isadora Hellegren have been writing papers about this, for example. I would also cite Linda Monsees's recent book about crypto-politics, which retraces a number of controversies that generate specific ideas of security, citizenship and privacy. So we try to merge this first social science angle on a very technical and complex subject with another: continuing to unveil what decentralized technical architectures are about from a social sciences perspective. In summary, we built on some of my previous work on what I call the "dwarfs without giants," alternatives to mainstream Internet services.
So we examine the concurrent push toward different types of design choices as a source of sociotechnical, political, economic, and legal tensions. We also draw, of course, on Laura DeNardis's work analyzing Internet governance as a set of "control points," and we have crossed it with recent work on movements of decentralization of the Internet and Internet services, from the work done on blockchains to the ways in which privacy is preserved through architecture-based means. And we go back to what Philip Agre said in the early 2000s: that architecture is politics, but it should not be understood as a substitute for politics. So in this book, in several instances, we look at all the ways in which the picture of a linear translation between a technical architecture and something else that has a political value is complicated, and made more difficult, by the reactions of different actors and by different choices that have to do with both the economy and technology, and so on. So, this is back to
you, I guess. It is very much connected to the architecture question, because we decided to divide some of the chapters of the book according to the architecture of the messaging app or apps that served as the use cases for that chapter. After analyzing the thirty or so messaging apps that we first mapped, we found that the three main architecture types were centralized, peer-to-peer, and federated. So we decided to write about each architecture, what it does and does not offer to users, and also about why the advocates of centralization, for example, think that it is much better than going federated or decentralized. We also tried to map the controversies around these architectures. The first use case of these three big chapters is Signal, one of the most famous messaging apps offering state-of-the-art security. What is interesting about Signal is that it is centralized, and its owner, Moxie Marlinspike, defends centralization. He has written a blog post that has become a kind of iconic text for many developers, "The ecosystem is moving." In this post he defends centralization as offering much more security for developers, because they can control all the implementations of their product, whereas if a messenger is decentralized, there might be leaks, mis-implementations, failures and so on. So he insisted on the advantages of centralization, and we tried to understand his motivations. Even though we managed to interview Moxie, he actually openly asked us not to quote him explicitly in any text, so it was quite tricky to write about Signal.
So we chose an apophatic way of writing about Signal: asking people from other projects that reuse the Signal protocol to tell us how they use it, because the Signal protocol has become a kind of basis even for Facebook's messaging apps; WhatsApp uses the same encryption protocol, as does Matrix, which is a decentralized app built on the Signal protocol. The Signal case is also interesting with regard to standardization, because they decided openly not to standardize their protocol, and even used a kind of what could be called "obfuscated open source," which means the protocol was out there but it was not documented. So it was very difficult for other projects to implement it, and Signal offered
implementation-as-a-service as a kind of business model, providing technical consultancy and help to projects that wanted to implement the Signal protocol. The second case that we used to understand the variety of protocols offered by secure messaging apps was LEAP and Pixelated, which are more or less the same project. In this chapter about federation, what we try to understand is what federation offers that is not there in peer-to-peer or centralized systems. We found that, basically, federation offers four C's, which I think is a nice formula that Francesca came up with, her kind of creative touch. The four C's of federation are care, community, customization and compatibility. Federated messaging apps rely on a variety of community factors, and I will give you just one example: the most famous is email. Email is actually a federated communication system, and it is pretty much dependent on email service providers who maintain servers and who care for the service. But it is also compatible: compatibility means that I can send you an email from my CNRS account to your Gmail account; I don't have to force you to also get a CNRS account. This is what federation can offer. And in the end, we also looked at peer-to-peer messaging. It is something that has been around for many years, and peer-to-peer is full of promises, also politically (because we spoke about architectures as politics, but not as replacing politics). Peer-to-peer is a kind of anarchist dream of communicating directly with each other, without intermediaries, without creating these kinds of stakeholders that keep all your data on their servers and so on. Direct communication between equals: this is the promise. But in fact, even now in 2022 (the fieldwork was finished around 2018-19), there is no secure messaging app that is peer-to-peer, easy to use, and that offers users
an easy management of group chats, a simple interface, and so on. So in these chapters we have described these three types of architecture, putting them in the broader ecosystem of projects. And now we are coming to our conclusions, because we also want to have a discussion. First, some conclusions on decentralization, which I think are important for advancing the discussion about decentralization. We wanted to show that the choice of a technical architecture is a very complex, context-based compromise; it is not just the result of choices between abstract models, as if cryptographers were abstractly choosing to make a decentralized or centralized app. It is very much context-driven, economically driven, and politically driven. And peer-to-peer, as I just said, is not an ideal solution for all problems. Another set of conclusions that I think is important concerns encryption. We wanted, of course, to nuance the framing of encryption as a mechanism that allows criminals to conceal the content of their communications from the judicial system, which is quite a popular claim of many governments, the Five Eyes for example, in arguments against encryption as something that helps terrorists. We wanted to show that encryption needs to be legal, as digital technologies not only increase the possibilities of surveillance but do so in a fundamentally asymmetric manner that is incompatible with democracy. We see this asymmetry, for example, now in the case of the war in Ukraine, where Russian state surveillance has grown so big that resistance inside the country would be completely transparent to it if it were not for encrypted messaging apps, for example Signal or Telegram, which have helped people coordinate a democratic movement within an authoritarian country.
And of course, encryption helps partially restore the social norms around communication that were expected before digital communications appeared, for example the right to privacy and the right to have secrets. Encryption, and the ability to compromise it, is an intensely political issue, and not an outright proxy for power, as we also showed with our recent research on Russia.
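The email-style federation described earlier, the "compatibility" among the four C's, can be sketched as a toy model. This is only an illustrative sketch, not any real protocol; all provider and user names here are hypothetical:

```python
# Toy model of federated message delivery (email-style). Illustrative only.

# Each provider (domain) runs its own server holding its own users' inboxes.
servers: dict[str, dict[str, list]] = {"cnrs.fr": {}, "gmail.com": {}}

def send(sender: str, recipient: str, body: str) -> None:
    # Federation: the message is handed to the recipient's provider,
    # looked up from the domain part of the address.
    domain = recipient.split("@")[1]
    if domain not in servers:
        raise ValueError(f"no server federates the domain {domain!r}")
    inbox = servers[domain].setdefault(recipient, [])
    inbox.append((sender, body))

# A CNRS user writes to a Gmail user without needing a Gmail account.
send("ksenia@cnrs.fr", "jo@gmail.com", "hello across providers")
print(servers["gmail.com"]["jo@gmail.com"])
```

A centralized design, by contrast, would amount to a single entry in `servers`: every account would have to live with one provider, which is exactly the trade-off between control and compatibility discussed above.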
So we also say a few words, in the conclusions, about standardization, because it played a large part, for example, in the way Signal has become such a prominent actor in the secure messaging field with its informal standardization processes. We have precisely noticed that developers in the secure messaging field increasingly emphasize that standardizing bodies are very much institutionalized. It is a little bit what Sandra was mentioning in the chat about the IETF not reaching much consensus when it comes to quantum computing. The same thing is happening in this field: a lot of developers feel that there is a progressive distancing of standardization bodies, even those that are pretty informal and made up of members of the technical community, such as the IETF, from the coding community. This often creates an environment that developers feel is less suitable for experimental and unfinished projects, and so we have seen, in several ways, that formal standardization is felt as being neither necessary nor desirable. We also say a few words in the conclusions about the General Data Protection Regulation. We do not make a treatise of it, because there is really a lot to say about it and several other people are doing that very well. But one thing we did wish to emphasize is that there is one core hypothesis underpinning the creation of the GDPR that might affect its effectiveness when it comes to the tools for secure communication that we analyze: there is this interesting premise that the GDPR is fundamentally based on the paradigm of centralized architecture.
You can see it, for example, in the way the role of the data controller is outlined: controllers are implicitly assumed to be large centralized servers under the control of external entities in charge of the cloud. So what we suggest in the book is that, if there is the time and the opportunity to refine and update this in the future, it could perhaps better account for what is happening in fields like this one of secure messaging. To sum up, we describe encryption today, in this field of secure messaging, through the lens of what STS scholars of Internet governance have called "mundane practices": emphasizing the daily dimensions of Internet governance, both on the side of developers' choices and activities and on the side of users' activities, but still deeply entwined with the institutional arenas, standardization bodies, and international, supranational or national authorities that try to regulate issues related to digital liberties and communications, such as privacy or freedom of expression. STS can help to analyze the ways in which all of the above is entwined and continuously crosses over, both in terms of actors and of processes. So this is it on our end for now; we do look forward to answering questions, and we give the floor back to Jo for some comments. Thank you so much. I will stop sharing the screen so that we can all see one another better.
Okay, thank you. Nice presentation, Ksenia and Francesca. So, I was asked to reflect a bit on the book, having read it, going into some topics, and possibly in that way leading up to the discussion questions; there is already a question by somebody in the chat, so let's move to that later on. So, as has quite
nicely been said, you couldn't have a more timely book at this moment than this one, I think, in our field: so much on topic, given the current situation, of course, and also regarding the core message of the book, presented just now, that these encrypted messaging tools are really at the center of what the authors call a powerful double narrative. On the one hand, there is a strong positive discourse around empowerment, user empowerment, citizen empowerment, and, in that way, better protection of fundamental civil liberties, and how these tools enable that. But on the other hand, there is also a very strong critical discourse, especially by intelligence agencies, but sometimes also by political parties and other factions of society: the allegations that this technology fosters crime and terrorism, that "you shouldn't allow this," and so on. How to balance these is exactly what this book nicely addresses, also bringing in perspectives from the fields dealing with these issues, namely science and technology studies and communication studies. So I would situate the book's relevance on two levels. On societal relevance, as I mentioned already, I can even give an interesting anecdote about Belgium. At the moment, you may or may not know, there is a proposed law in Belgium that would force all communication players to retain, to store, all the metadata they have. This is a retry of an old regulation, a data retention scheme that was in the end abolished by the European Court. And now the Belgian government thinks it has found a solution: not to store the data itself, but to force the storage of metadata, which, in fact, would prohibit secure messaging systems like Signal and others if this law were to pass. Now there is a huge discussion taking place in Belgium about this, with the government on one side and, on the other, parties ranging from the more right-wing parties to the Greens.
And so a very interesting discussion is happening on this double narrative at this moment: should we indeed allow the government access to this kind of metadata, or is that not in the interest of civil liberties? But of course even more societally relevant, and it was mentioned rightfully, and also added on the website (Ksenia and Francesca added an update on the book about the situation with the invasion of Ukraine and what's happening there), is how this plays out in the field of secure messaging: how these tools enable protecting people who are fighting for their rights, but at the same time are also being used for spreading disinformation, for misinforming people, for hate speech, and so on. So it's a very double-edged topic, and very nicely framed that way within the book, I would say. Especially given the situation in Ukraine, it's now not so much a matter of freedom, as they eloquently put it, but also a matter of saving human lives in the end, and that makes it of course so much more relevant. Also regarding societal relevance, I think the book fits in current debates on the technology scene on the move to decentralization, or, as some call it, re-decentralization. On the one hand there are things like edge computing, where the intelligence is moved towards the edges of the network rather than sitting in one big cloud system. But there are also developments taking place in Belgium, in Flanders, in another project I'm involved in, regarding data vaults: the whole Solid technology idea, with Tim Berners-Lee involved through his own company Inrupt, developing data pods where individuals themselves hold the data and only give access to certain kinds of parties, often aimed at private parties like Facebook and other social media.
So there too you see decentralization and re-decentralization very much taking place at this moment. And finally, on the societal relevance, I would say the move towards interoperability. You now have the Digital Services Act in the making, and there's still a lot being debated there, but the first proposals go in the direction that messaging systems would be forced to be interoperable, so that you cannot have silos of messaging; they should operate in a similar way, as was also mentioned in the presentation, as email does: it's not important which email provider you have, you can just send messages to each other. And besides the technological
initiatives being taken, as very much described in the book, things are now also happening on the legal-regulatory front regarding interoperability. And that can only be applauded, of course, because it avoids the silos within these kinds of systems, and path dependency in a way. Besides the societal relevance, what I very much enjoyed in reading the book is of course the guidance it gives: the way it helps thinking within a couple of disciplines very much oriented at these issues, foremost science and technology studies, but also media and communication studies at the crossroads with computer science. If you look at the way media and communication studies has looked upon these kinds of issues, especially when you talk of media technologies or digital technologies, it's very interesting to see, and this is also reflected in the book, that there is always this three-part perspective, as Lievrouw and Livingstone once described it, of mutual shaping. On the one hand, we look into the technological artifacts: what are they able to do, what is happening, but also what is not possible, what kinds of things are ruled out, and what affordances these artifacts have. But this needs to be complemented, and it is also nicely done through the interviews in the book, by the practices of people: different users, but also other stakeholders, like civil society and civil rights organizations and so on, and how they, in the end, give their own meaning to this, and how this is not only a technical story but also a social, cultural, user-oriented story. And this is really nicely put in the book: it's not such a black-and-white situation. Even when discussing the scorecards by the Electronic Frontier Foundation, they indicate how these kinds of things cannot easily be cut up in a black-and-white way.
And you need this broader perspective, to look at this from a kind of social-science, user perspective. As the third level, besides artifacts and practices, Lievrouw and Livingstone talk about the socio-economic arrangements, and this of course relates to the whole issue of standardization and regulation, but also to the business models behind which these kinds of tools are established: where is the money coming from, and how might this influence the way these things are operated and the kind of viability they have in the end? In that regard, and as mentioned also in the presentation, there is the importance of these mundane practices, one of the three levels I just mentioned, and how this relates to some intriguing concepts and discussions, for example the digital migration problem: if systems are not interoperable, how do people engage with migrating from one provider to another? This even leads, for example, to security trainings using some kinds of technologies which some people in the West describe as no-gos (WhatsApp, Facebook, Meta kinds of things), but when you look at it from the bottom line of how to protect yourself without too many hurdles or bottlenecks, well, it might be just as easy to go with a system like WhatsApp, which is in the end end-to-end encrypted. Of course, you then have the big players looking into these kinds of things, especially at the metadata level. But if it's a matter of life and death, or a matter of being free or not, then these kinds of concerns are almost a luxury problem. That is an interesting reflection I also took out of this book: how to balance these kinds of things, how to think about them, and not to cut them up too easily into high risk and low risk and so on, which they also nicely criticize in that regard.
What I also very much liked about it comes from my own experience of doing many, many technology-oriented projects with engineers, in European projects but also elsewhere. As they frame it, being "useful sociologists" in a team of technologists is the kind of thing you are confronted with. I've gotten used to it by now, although as a social-science researcher you still keep being pushed in a direction which technology people understand, meaning: if it's about users, it's user interfaces, it's human-computer interaction, it's usability. That's something they understand, and it's easy to frame towards them; or, the other way around, to go do research and deliver some big survey which then indicates how much usability or user-friendliness there is. They try to get over this within the book, but you also see within their writing the struggle to deal with it, and also to publish: for example, they need to go to the level of usability to some extent in order to make the link, to do meaningful interdisciplinary work in a way that the other side, meaning technologists, understand some of the things you are doing, while of course also going beyond this when they describe the more theoretical connections. And a final remark on the book: I see a lot of links to some very interesting work happening lately by Seda Gürses on computational infrastructures, where she talks about how it's often not so much an issue of privacy, even, or data protection, but of market dominance. And then you go more into the political economy of it, for example with the corona tracing apps, where in the end it was Google and Apple that decided what kind of protocol, what kind of way, these tools should be built. Luckily, they chose the decentralized design; but for the same reasons, they could have chosen the centralized way.
And so there was no debate at all, no democratic discussion about what the outcome should be in the end. It was these two parties that pushed countries, France too, I think, towards the decentralized tool. And that's nice, because in the end we got a decentralized, much more privacy-friendly tool; but it's less nice because it shows the dominance of these players in deciding on core matters. These are not health companies; these are, to some extent, an advertising company and a technology company, deciding on something like corona tracing apps, which is kind of strange if there is no democratic debate around it. So, some remarks on that level also. Finally, I just visited the CPDP conference, a very interesting conference on privacy and data protection, with a lot of debate happening there last week in Brussels on the issue of decentralization and the possible solutions, technological solutions but also regulatory solutions, being proposed at the moment. One of them, which I didn't know about, for example, was an alternative called Nym, in which a lot of very prominent cryptography scholars are involved, to build another system, possibly more sustainable because it is also based on subscription fees and so on. So there again you see, reflecting back, how relevant and how timely all these elements are at this moment. I have taken up sufficient time now; other comments I can possibly introduce later on, but I first want to give the floor to the people who are present. Perhaps first of all, I'm not going to reread the questions, but I saw an earlier question by Sandra Braman related to another technology she's involved in, quantum computing. Sandra, would you perhaps raise the question, after the reflections you have?
Thank you for those excellent comments, and thank you to the speakers; it's a really interesting book. The question I actually have is, of course, the punch line: readers want to know, what's your recommendation? What should I be using? You danced around that: you had critiques of two versions, and you didn't offer a critique of the federated version, so there was an implicit answer, but if you're willing to make a positive statement, it would be very welcome. The comment earlier was about quantum; just to add to it, it wasn't a question, it was just a comment, because I'm a fellow at the Center for Quantum Networks, which the University of Arizona has, won with US National Science Foundation money. I know there's a lot of activity in Europe as well around this. I'm just saying that I've been astonished (this is my second year with them) at the extent to which they aren't paying any attention to the IETF, or even to the question of protocols. The first meeting to even begin to discuss reaching any agreement on what might be standards is this fall. In their case it's for very good reason: there really are multiple different ways of building a quantum network that differ along many very extreme dimensions, so it's actually too early to reach any consensus. But one would think they'd be paying attention to what's actually going on at the IETF and other standard-setting bodies. There are two RFCs already at the IETF dealing with quantum networks, there is an active research group dealing with quantum networks, and there are further RFCs being discussed that deal with quantum networks at the IETF. So the internet is looking at quantum; quantum isn't yet looking at the internet.
Thanks a lot for the comments, Sandra, and for the question. I will start, and Francesca can maybe also add something. Regarding advice: do we have something to recommend? I think the answer is in the chapter that talks about threat modeling and risks, because we were analyzing how trainers recommend certain tools and the basis of these recommendations. And we found that there was a transition away from tool-oriented trainings that put one tool in the center, for example: Signal, that's great, everyone install Signal, I will help you install Signal, and so on. Now the trainings are more about understanding the situations of communication, the communication acts that you yourself are in. What are your social graphs? Can you, for example, describe the contexts and the groups that you have: colleagues from the university; this other research project, which is a risky area; my friends with whom I go to protest every Sunday; and so on. And based on those outputs from the users, trainers would recommend certain apps for certain communication acts, very context-based. That's why we do not come up with recommendations. But it's true that we have a slight, somewhat subjective, preference towards federation. And I think this was a little bit biased, because I was myself involved in a federated messaging app, and I decided to help them because federation seemed to offer some answers to the most urgent questions underlined here: interoperability, for example, data portability, and migration. These cannot be offered by centralized solutions and are hardly offered by peer-to-peer solutions, so the one that is left is federation.
And we see it in the rise of the fediverse, for example, which has a big success now in response to some of the Twitter scandals. Big social networks like Twitter promised us a global village where everyone would discuss with everyone, but in the end, because the data was so important and the volume so big, their algorithms ended up enclosing users in certain bubbles. The fediverse offers a different model, also for moderation, for security, and so on. So there is a slight preference towards federation as an architecture, but within this galaxy there are many, many projects, so we don't recommend any specific project per se; it is also maybe against the ethics of research to recommend anything. What we do recommend is to analyze your own threats and your own risks, and to understand risk as a dynamic, ever-changing process: risk is a process, not something given once and forever. And to understand that risk is relational, because you are always connected to people who are weaker, in the sense that they experience more threats than you. This is a lesson I personally learned from a person we interviewed for this book, who was himself in a very safe space, in Austria, just organizing festivals. And he said: well, I have to move to secure messaging apps, because I invite people from Iran or Bahrain to speak at my festival. So this would maybe be my advice: tailor recommendations according to what you experience. And also, something I learned from observing trainers, especially trainers who are not from the global North but who work on, as I said, Iran, which is one of our cases: I learned a humble attitude, and empathy with your users. Of course, recommending PGP is nice, it's cool, but it's also hard for people. Make a step towards the user, give her a hand and say: okay, this is hard for you;
I understand you don't have time to figure it out, so here is this other app that is easier but also protects you. So that will be my answer.
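The context-based, relational approach Ksenia describes can be caricatured in a few lines of code. This is purely an illustrative sketch: the context names, risk levels and recommendation strings are invented for the example, and are not taken from the book or from any real training curriculum.

```python
# Illustrative sketch of context-based threat modeling: instead of one
# "best app" for everyone, each communication context gets its own
# recommendation, and relational risk can raise the effective risk level.
from dataclasses import dataclass

@dataclass
class Context:
    name: str               # e.g. "protest group", "university colleagues"
    risk: str               # "low", "medium", or "high"
    partners_at_risk: bool  # relational risk: someone in the group is more exposed

def recommend(ctx: Context) -> str:
    # Protect the most exposed person in the social graph,
    # even if your own situation seems safe.
    effective_risk = "high" if ctx.partners_at_risk else ctx.risk
    if effective_risk == "high":
        return "end-to-end encrypted app with minimal metadata"
    if effective_risk == "medium":
        return "mainstream end-to-end encrypted app"
    return "whatever the group already uses"

contexts = [
    Context("university colleagues", "low", False),
    Context("sensitive research project", "medium", True),
    Context("protest group", "high", True),
]
for c in contexts:
    print(f"{c.name}: {recommend(c)}")
```

Note how a "low-risk" context still gets the strictest recommendation as soon as one participant is at risk: that is the relational-risk point made in the chapter, reduced to a single line of logic.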
Can I ask two quick follow-ups? That was a terrific and valuable answer, Ksenia. The two follow-up questions would be: one, what about supporting others? At one point I read that Lantern was really important in Ukraine, so I subscribed to Lantern, actually, as a way of supporting those in Ukraine. Did that make any sense? I'm not actually using it; I was just trying to support them. And the second question is,
how to deal with the boundaries of what we actually know about the people in our network. I remember, when I was a section chair in IAMCR, receiving lots of requests to get networked on LinkedIn from people who I assumed were in the association but about whom I actually knew nothing; I didn't remember if I'd ever met them. Maybe that's my failure, but I wasn't sure I'd ever even met them. And it made me very conscious of that question: what do you actually know about people? So how do you deal with the limits, for that other part of your very important answer?
Thanks. Maybe I will answer the first follow-up, and Francesca, you may want to react on the social-graph question. On supporting projects: I think what you did is really great, because what our book shows is also the fragile economic status of many of those projects. They are often made by volunteers, and most of them are free, with one exception; and they are not using your data, unlike WhatsApp. So they are based on grants from different governmental and private actors, or on user support and donations. I think what you did is really a great gesture of support to developers of a free service for people who are really risking a lot. Thank you for being really responsible, and thank you for this gesture. It's really important, because it points to this economic fragility and the absence of an answer to the question: how can we make these projects last without transforming them into traditional businesses or startups?
You can go for the second part as well, because I was thinking to maybe add something afterwards with regard to the relational-risk chapter that we didn't talk a lot about. So go for it, as it was a follow-up to your question, and then I will add.
This is really a hard question; I never thought of that. Of course, we don't know everything about our communication partners. But I would use a general presumption that people may have something to hide. And this is better, or not better, but more respectful towards your communication partners, than to think that they have nothing to hide. I compared this to COVID at some point, when I was talking recently at another conference: you have been tested negative, so you think that you cannot contaminate anyone, and you don't wear a mask. But you don't know what's happening with the people around you, and in a time when it was super important for everyone to take these measures, we had to assume we might be in contact with a person who is at risk, even if we ourselves only had a light version of COVID. So even if you have nothing to hide, let's pretend everyone around you is in danger. I also wanted to refer to Russia, and this is a pity, because I'm always looking at what's coming next. Ten years ago, there were things that were not yet illegal, but I had an intuition that they might one day become illegal, so I'd rather protect myself; and they are already illegal now. Living in countries like Russia, you have to behave as if the state were your enemy, every day, all the time. This is kind of the maximum security that you can offer. And I think that's what I learned: thinking of the most endangered parts of your network and caring for them is the kind of ethical guideline in communication. Yeah, I guess, just to
follow up quickly on this: if I had to answer your question concisely, it would have been with what is, I think, the last sentence in the chapter we wrote on relational risk and threat modeling, which is that we should behave as if we were always protecting the person in our social graph, in our network of contacts, who is most in need of concealing. Even if it seems to us that we are in a situation in which we can probably live out in the open, we should always consider the fact that we are situated in this broader network, even if we are not necessarily directly interacting at that moment with the most at-risk person, and sometimes make choices that we otherwise would not. As for how to recommend tools: this is something that actors such as the Electronic Frontier Foundation and other NGOs concerned with digital liberties attempted an answer to, and one of our chapters retraces in quite some detail the attempt of the EFF to promote a guide through this maze of communication tools. We found it a really interesting case, because they did a first version of this guide that was very much tables and ticks: on one end the tools, and on the other end, can there be a green light, or do we have to put a cross because the tool does not fulfill this criterion? And progressively (we were fortunate enough to talk to them about their subsequent choices) they moved towards increasingly qualitative guides, to the point that this whole process, which lasted several years for them, ended with a blog post on the EFF website saying they had not been able to actually make a guide based on a visual, schematic appreciation of the tools, because it really depended on one's personal journey, on what profile one presented to the world and what one's backstage story was.
And so we finish that particular chapter by summing this up: they really seem to arrive at the conclusion that a very qualitative and tailor-made approach, such as the one Ksenia was describing with the trainers, increasingly seems to be the most appropriate one.
Thank you, Ksenia and Francesca, for those answers. I have another question. Again, the core message of the book, as I mentioned in my comments, Ksenia and Francesca, is that this is a very difficult discussion, society-wise: how do you deal in a good way with secure messaging? An anecdote from Belgium and the Netherlands: in the middle of 2020, there was a huge operation by the police, breaking open a secure messaging app called EncroChat, which was delivered on special phones, and later the Canadian service Sky ECC. And they were cheering and celebrating this breach of the encryption, because indeed they found a lot of criminals, even torture rooms where drug lords tortured other people. So it was a really big win for the police forces; even now, people are still being arrested based on the breaking of that encryption. And this is used by governments, of course, to say: look, it's important that we are able to enter these kinds of secret channels, because they are used by criminals. That's the typical argument, and especially when these kinds of things happen, they substantiate that position in the debate. So how do you feel when these kinds of things happen? How do you deal with this, and what position can you take in these kinds of discussions? Do
you want to start, Ksenia, or shall I?
Please start?
I'll start with, let's say, a general principle, and then perhaps Ksenia can chime in with her experience as an activist and with what's currently happening in Ukraine. At some point there was a report by a number of researchers at the MIT Computer Science and Artificial Intelligence Laboratory that was called "Keys Under Doormats", and it was quite a landmark report, because they were basically saying (very basically; I'm translating a lot) that once the door is open, or at least a key is left under the doormat, even if only ever meant for a particular individual to open the door, it is available to everyone. And this is literally the conundrum of this double-narrative thing, in the sense that if encryption is broken, or if one were allowed to break it, it will never be selective. So, once again, it is pretty much context-dependent to what extent the benefits of doing so outweigh the harms. There is no single answer to that question, I would argue, and this is why it is such a big political controversy: each state has an ongoing debate on this, and the issue is that there will always be, and rightly so, examples coming from both sides of the debate that one would feel morally right to uphold. It's really complicated. But technically speaking, the developers' core argument is the keys-under-doormats one: even if the keys work for a particular state with the best intentions, they can generate significant collateral damage in other contexts, or in that very context, and this should be carefully weighed against the benefits.
Okay, yeah, thank you. Just looking at the audience: are there questions you have, or reflections you want to make in this regard?
I see a note by Maria in the chat, saying that around the Online Safety Bill in the UK there are a number of big debates on these issues, and some of the most interesting developments have been post-Brexit. Yes, I think that among the significant contexts we mention in the concluding chapter there is the UK one. Thanks, Maria, for the comment and for attending.
A very good comment by Maria indeed. These differences in opinion are also socio-cultural differences; think of the differences even between the US and the European Union, with the US emphasizing freedom of speech while the European Union emphasizes the fundamental right to privacy, and how these might conflict. Related to that is how the messaging app Telegram is being dealt with; I saw notes on the website adding to the debate on this. I have a PhD researcher who is looking into this, mainly into how Telegram has become a safe haven for a lot of alt-right and extreme-right parties and groups after they were deplatformed from other platforms and regrouped on Telegram, doing quite some, yeah, not-okay things there. There is also the lack of possibilities on the side of Telegram as a platform: even if you would try to set up some community rules or norms in your group or channel (you can have a group chat, and you can mute people), if you would want to do more, the affordances for doing so are not there. You have no way to say to the community you are forming on Telegram: look, we're not going to do hate speech, we're not going to target people, or something like that. You're not able to do it, even if you would want to, except by setting up a bot or something, which is a high technical hurdle and not easy for all group-chat moderators. So the platforms differ in their ability to tackle some of these issues, and some toxicity stays around because of this, while others, like Meta/Facebook, are being forced to do, or at least are doing, some kind of moderation, as also on YouTube and so on.
So how would you view this? You talked about Telegram, but the business model of Telegram is quite strange: it's a billionaire funding it himself, I think together with donations, but somebody with a lot of money is paying for this. So how would you regulate this space?
Maybe I'll start. Yes, first about moderation in Telegram. There are possibilities to enable moderation, and they are becoming more sophisticated. For example, the new version of Telegram has protected chats from which you cannot share anything: they are really closed, and if you send a voice message it can only be listened to, and a text message only read, within the chat. This is something interesting, used for example by Russian activists who are coordinating and are afraid their content might be shared outside the room. So this was one step that Telegram took toward assuring the privacy and security of its users, and I'm personally sympathetic with this gesture of enabling and disabling sharing by design. Bots are becoming very easy now; there are bot-constructor kits that you can use yourself. For example, I built a bunch of Telegram bots, and I'm not a developer, so the barrier has become lower. On moderation, there are already a number of bots that can kick people out, deciding, for example, that if a user posts something and people downvote the message more than ten times, the person is automatically out of the group; and you can customize for how many days or hours, or forever, the person is banned. So moderation is progressing. Telegram is interesting because, yes, there are weird things happening on it, but it's also a community-based platform, because the bots and the API enable community creativity: developers make a lot of bots and extend the possibilities of Telegram beyond what Pavel Durov and his brother Nikolai originally thought of, so Telegram is becoming kind of decentralized in a way. And I've also seen numerous times that Telegram would block channels, for example channels that post people's private information.
Durov, or rather the administration, would watch these channels if they were reported. So some of the mechanisms work. Durov is a libertarian, and he has stated that libertarianism has this motto of free speech; and free speech, unfortunately, includes alt-right speech. That's why he is not going to ban people who have alt-right ideas on Telegram unless they are reported by multiple users, and the report mechanisms work when there is crowd-sourced action, when many, many users report a chat. In this sense it is much more bottom-up, I would say, than on Twitter; people have coordinated and gotten channels banned many, many times. This is interesting, and I'm glad about it. But I'm also very sad about how Telegram became a prison for everyone who cannot abandon it, even those who are technically very, very advanced. That's what we saw in our work: cryptographers who understand the failures and weaknesses of the MTProto cryptographic protocol that Telegram is using, even they stay on it. This is something we heard especially from the Russian audience: Telegram has become this big, big thing that keeps Russian users hooked on it like a drug.
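The crowd-sourced moderation mechanics Ksenia describes (a user is kicked once a message collects more than a threshold of downvotes, with a configurable ban duration) can be sketched as plain logic. This is an assumption-laden illustration, not Telegram's actual Bot API: the class name, the threshold of ten and the ban-hours parameter are invented for the example, and a real bot would wire this logic to a bot framework's update handlers.

```python
# Minimal sketch of vote-threshold group moderation, as described for
# Telegram bot-constructor kits: N distinct downvotes on a message ban
# its author for a configurable number of hours. Pure logic only.
from collections import defaultdict

class VoteModerator:
    def __init__(self, threshold: int = 10, ban_hours: int = 24):
        self.threshold = threshold
        self.ban_hours = ban_hours
        self.votes = defaultdict(set)   # message_id -> set of voter ids
        self.banned = {}                # user_id -> ban duration in hours

    def downvote(self, message_id: str, author_id: str, voter_id: str) -> bool:
        """Record one downvote; return True if the author gets banned."""
        if voter_id == author_id:
            return False                # authors can't vote on themselves
        self.votes[message_id].add(voter_id)  # a set: one vote per user
        if len(self.votes[message_id]) >= self.threshold:
            self.banned[author_id] = self.ban_hours
            return True
        return False

# Usage: with a threshold of 3, the third distinct vote triggers the ban.
mod = VoteModerator(threshold=3, ban_hours=48)
for voter in ["u1", "u2", "u3"]:
    banned = mod.downvote("msg42", "author9", voter)
print(banned, mod.banned)
```

Storing voters in a set is what makes the mechanism crowd-sourced rather than spammable: one user repeating a report does not move the count, only distinct reporters do, which mirrors the "reported by multiple users" condition in the transcript.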
Thank you, Ksenia. We have five minutes before wrapping up, so Sandra, do you have a final word, a final question?
In response to what Ksenia just said about voting people down: one of the things we're facing in the US (okay, I'm in Texas at a conservative university, but this goes on all over the US) is that there are now organizations that train students on how to get rid of faculty members who say things in the classroom with which they're uncomfortable. And we've had multiple experiences of students organizing themselves through some of these tools, mobbing professors and getting them into a lot of trouble, even fired. And of course, this is people trying to get rid of those who are communicating progressive views, not alt-right views. So I'm not convinced that this is an advance in moderation.
I think these topics are super urgent, I agree. And I would like to say that last week, at a pre-conference to ICA, we had a session touching on these themes. And I would have to say, Sandra, that there is a lot of variety in the ways you can see things progressing, moving forward, but in very debatable ways as well. So yeah, thanks for raising that; it's actually awful how effective such means can be.
Yes, thank you. Interesting; I didn't know about that pre-conference, very interesting to hear, Francesca. So it's an urgent issue to handle, totally understandable. I want to close off the session by thanking the speakers, of course, for the really nice presentation and the very nice book, and thanking the organizers, Bruce and the others, and Andrea, for helping with setting up and facilitating this event. So perhaps we give the last word to Andrea for the closing words, and then we can wrap up.
Thank you
so much, Jo, for the moderation and the discussion. Thank you again to Francesca and Ksenia; congratulations on the book, it's amazing. I'm really happy that I could also learn a bit more about your work. Thank you, everyone, for coming. I just posted in the chat a link for the next IAMCR webinar, a collective book launch by the Journalism section. And yeah, thank you to everyone who was here. Congratulations again, Francesca and Ksenia.
Thank you. Thank you all, and thanks for attending. See you around soon. See you soon, somewhere.