The Behavioral Data Debate We Need
Our Executive Director Stephanie Hankey writes for Project Syndicate about how we use systems driven by big data in crises such as the coronavirus pandemic.
by Marek Tuszynski
This text, also available in French, addresses questions about which tech is good, safe and appropriate to use in these complex times if we want to act and work responsibly and remotely. How do we decide which technology we should trust? It also discusses what could be done in the future to make answering this question much easier than it is now.
Trade-offs are usually not very appealing
In this time of crisis we are at a technology crossroads. We are facing trade-offs between what seems efficient and quick versus what seems ethical and safe. In either case, we have to deal with the long-term political and social consequences: giving up on our values or investing in equitable tech.

Whenever something happens that forces us to rethink what kind of technology is available and how we should use it, Tactical Tech is asked by its partners and through its public engagement activities, such as the Glass Room: what tools do we recommend that are user-friendly, functional and won’t risk our safety, privacy and security? In other words, we are often asked for alternatives to the most commonly used tools. Many people ask us to recommend communication, collaboration and networking tools that are designed with users’ rights and privacy principles as their primary design choice – and that work without the big data-hoarding complexes of Facebook, Alphabet or the like.
These questions have become much more frequent during the current coronavirus pandemic. The expectation is that we, or other entities working with technology and society, will recommend the ultimate infallible toolbox of ready-to-use tools. People want the list of tools they should install and they might not want to work through the questions about why to use them or why not, which are in truth difficult. There is an urgency to the situation and people are concerned about the choices they make, so we often end up suggesting some easy-to-use alternatives that demand less investment in skills, resources and time. But it’s not – and shouldn’t be – that simple.
Here we’ve compiled a mixture of explanations, advice and recommendations about technologies for working together remotely. You will also find lots of ideas about further reading and where to find it. The text is organised in four parts. It takes a long time to read but we hope it is worth it.
The idea that there are tools that would always work for everyone, everywhere; require no extra knowledge and zero additional infrastructure; are fair and just, and protect users at all times, is a dream that has not yet come true.
What we have at our disposal is far from that dream, very far – and running Tactical Tech for nearly 20 years means we have seen a significant shift from stand-alone tools with few data traces to tools that are entirely built to harvest as much personal data as possible. Unfortunately, today the majority of tools on which we are all dependent are made by businesses with an exclusively Silicon Valley mindset and a business model that relies on extracting personal data.
There are two ways we at Tactical Tech approach the problem of data control and technology choices: one is mitigation, or ‘making it less bad’. Responding to the need for quick fixes is the approach of our Data Detox Kit and Glass Room projects, where we appreciate the fact that many individuals, groups and organisations are not in a position to make significant or drastic shifts in the technology they are using. This can be due to a lack of resources, skills, knowledge, help or funding. Our approach with these projects is to meet them where they are and introduce incremental changes and step-by-step improvements within known environments. This approach does not solve problems but it does reduce some of them. It gets users on track to gain a better understanding of technology itself and builds confidence to make more choices and take more ambitious steps in the future.
The second approach is about a change of mindset, or ‘short-term pain, long-term gain’. This is much harder to implement, and even talking about it is not straightforward. Sustainable, secure, independent tech requires a much more significant shift in how we think about technology. It demands much more significant investment and resources, and even though we at Tactical Tech promote free and open-source solutions and self-hosting, we know this is not the way to go for many groups we work with. More practically, it requires significant resources: the tech might be free, but renting or buying servers is not. Maintaining them is not free either: you cannot rely on volunteers, and it requires constantly updating your skills and resources. It also requires training and a funding environment: if you want to support independent, secure, resilient and sustainable organisations, you have to think about technology as much as you think about management, finances or human resources. These are essential tools for work and they need serious long-term investment. That investment should be directed not only at users but also at makers. The reason some of these tools have usability issues, design flaws or a steep learning curve is that they are run by small, dedicated teams with extremely limited resources. This second approach requires not only a change of mindset but also a systemic, techno-political shift.
These two models describe the situation we are in: either we have to eat fast food or grow our own. The former is convenient but not very healthy; the latter is better for us but is time and skill intensive.
Can you build a democratic, equitable and open society with closed and proprietary tools? Have you ever been in a well-functioning public space owned by a corporation? Have you experienced a functioning commons cordoned off with surveillance cameras and sensors?
Most of the tools we use these days are not only cloud-based but were also created to be as simple as possible (frictionless) and to attract the largest possible number of users (social), who would in return become dependent on their platforms (user-based). These platforms are offered for free (you are the product) and they turn their user base into a data base (profiling) that can then be monetised (through targeted advertising or similar). We click ‘I agree’ to give them access to our behaviour (metadata) and facilitate cookies, beacons, scripts and what-not – anything goes, including fingerprinting browsers (trackers). One solution is to install an ad blocker, but the same data is used to shape your entire experience: filter bubbles, tailored bots, nudges, dark patterns and so on. Your experience of these technologies is far from being open and free; it is also far from high-quality. By now we know these things about the data dragnet industry but we use the tools anyway because everyone else does and they are so easy to rely upon.
Another thought worth bringing up here is about dependency. Any collapse or disruption to these centralised services used by billions of people, companies and even governments, would cause a problem of equal proportions. Large-scale dependency creates large-scale liability. We have already seen glimpses of what this can mean when services like Facebook/WhatsApp or Google Docs go down for short periods of time. What if they went down for weeks? These are tools like any other – they have their limits and their critical mass. The question is when they are going to reach their limits and with what consequences. It is not about if but when.
The last thought is about vulnerabilities. Is it really a good idea for the Prime Minister of a country like the UK to be running the government from home using an untested, insecure proprietary tool like Zoom? These challenges are not only for civil society. It is not only human rights defenders and investigative journalists that need confidentiality and trusted tools to maintain their integrity and safety; it applies equally to governments, businesses, public institutions and individuals.
Frictionless means effortless. In the real world, friction is necessary for understanding systems and making choices, as it is in personal relationships. Similarly, when we remove friction in technological design, things become easier to use, but we lose our cognitive ability to understand how they work and what business models are behind them. This approach, combined with the ‘gamification’ of our interactions (reward schemes, endless scrolls, positive feedback loops), makes us dependent on tools with practices we may otherwise disagree with. The scandal with Facebook and Cambridge Analytica is a case in point here.
For more about this new influencers' paradigm you can check our Data and Politics project – in particular the report on Personal Data: Political Persuasion.
And if you are interested in how these commercial tech solutions are funded, start here: “Is Venture Capital Worth the Risk? The industry shaped the past decade. It could destroy the next.”
Crisis, and in particular a pandemic like COVID-19, is not only deadly and dangerous to society, the economy and democracy. It also creates a fertile environment for promoting and relying on authoritarian modes of operation – and technology comes in very handy here. Technology that enables surveillance is always promoted in the name of (national) security and safety. This is not new and has already been exercised in other types of crises or in response to problems – such as terrorism, border control or movement of refugees. We can already see how many of the same technologies – such as mobile phone geolocation tracking – are being used in response to the COVID-19 crisis.
Check the list of steps taken to put quarantined people under surveillance in different countries here. Or, if you prefer The New York Times, read this. Also check how it works in the context of technology use in the EU border crisis in “Border Troubles: Medical Expertise in the Hotspots”.
With all that being said, in this text you will find some solutions and references to solutions – not the magic list you are looking for but rather (and more realistically) a way of thinking about technology that might help you make informed, verifiable choices that could work for you, your community or your organisation. There are choices out there – the real challenge is not only how to make them but also what kind of society we are supporting by making them.
Some useful principles for choosing technological tools
Here we will explain some of the principles that have always driven our choices at Tactical Tech when it comes to choosing which tools we use, and we will make some specific recommendations. The idea is to give you a framework that will help you find the right solutions for your needs and that also fits your capacities and resources. If you disagree, you can always make your own choices, which in fact might be much more viable for you – your answer might even be G Suite or WhatsApp (Facebook) – it depends on what matters most to you.
We always try to stick to these 7 principles when it comes to the tools themselves. They have to be:
If you think we are not addressing two important issues here – namely, data and encryption – you are right; that comes later in the text. Also, with regard to point 1, we prefer using Free Libre Open Source Software (FLOSS) – we will however stick to the term 'open source' throughout the text to avoid confusion about the 'free' aspect, as it is often confused with freeware, freemium or free-of-charge proprietary software, which it is not.
The 7 principles above can be summarised in a short statement. For us, trust is the guiding principle. Because software is made out of code, we want to trust this code and not the promises of those who want us to use it. If you have a black box (proprietary software) and an open box (open source software) to choose from, always go with the open box. Recommending or trusting a black box is risky, because one has to take all the promises of its makers at face value. Outside the company, others do not and cannot know if it is good or bad code. In fact, we know practically nothing about what data is collected, where our data goes, and so on. This does not mean that open source is better by default; in fact, it is also just some code written by some people, with a range of risks attached. But because it is open it can be verified – and that matters. The other thing that matters about open source is that it comes with a non-proprietary licence, allowing users to act independently, to share it with others and to change it. And it is free.
Unfortunately, open source has been overused as a description of software and this makes it confusing. Some components might be open source (say, an app you can run on your device) but the software running on the company server (cloud) might be proprietary.
As for the other principles, in times of crisis there is an unprecedented hunger for quick solutions. It is not only users who are looking for them but also investors, companies, governments, etc. This is also a time when a lot of innovation can occur. But as with everything else during a crisis, it is often chaotic. Demands focused on solving short-term problems trigger solutions that introduce new problems. There is little time to test, debug and even understand what is happening. This is why, for us, maturity is important when choosing tools we want to use. It might sound conservative – and maybe it is – but it is also responsible and rational, because when we make decisions too quickly there are real, longer-term consequences, and those consequences are often experienced by those who need the most protection and care.
For an in-depth list and explanation of these principles, please visit our Security In-A-Box project, which Tactical Tech started and then co-developed with Frontline Defenders. Scroll down to the section on Criteria.
And if you are a human rights defender working remotely, please read Frontline Defenders' guide “Physical, emotional and digital protection while using home as office in times of COVID-19 - Ideas & tips for human rights defenders”
Here, finally, the part you were curious about: some examples of tools that we recommend and use ourselves, which adhere to the above principles. Hopefully you are already using some of the essential ones. We hope you are running an updated version of your operating system and all your apps are updated, too… easy to say, hard to do.
Our recommendations for Internet Browsers:
A must: choose the right search engine as the default one and install some add-ons, at least:
(maybe) Facebook Container
If you must use an online tool, try Firefox Lockwise
Secure File Storage:
Connecting Securely to the Internet - Free VPNs:
Riseup VPN (free)
Proton VPN (free)
Lantern (free up to 500MB)
TunnelBear (free up to 500MB, not open source, good for beginners)
There are also many excellent paid options, and we would strongly recommend looking into these unless you can set up your own VPN. However, never use a commercial “free” VPN: such services make money by collecting data. If you need to know more about VPNs, we recommend the EFF's “Choosing the VPN That’s Right for You”. For a detailed overview of paid VPNs, check here. And if you still need to know more about how to keep your private communication private, read another short chapter of Security In-A-Box.
Not every Cloud has a silver lining
Remember that ‘the cloud’ is practically someone else’s computer. There are big (black box) clouds – either servers that are available to you for running services (e.g. Amazon, Microsoft) or platforms that store and process your data (e.g. Facebook, Alphabet or their subsidiaries such as Instagram, Google Maps, Zoom, Slack and hundreds more). There are medium-sized clouds, too. These are run by relatively small operators that might have their own infrastructure but provide you with specialised services (open box, e.g. Greenhost), or that might provide you with the things you need while also relying on the above-mentioned big clouds. And lastly, there are small clouds – self-hosted cloud services (mostly open-box ones under your own control) run by institutions, organisations or individuals themselves.
Not all clouds are equal. At Tactical Tech we prefer small and medium-sized ones that respect users and their rights and freedoms, and those that rely on open source solutions with business models that are not based on monetising personal data.
‘The cloud’ as a concept is very important – it allows multiple clients (laptops, mobile phones) to do more than they could otherwise. Your browser, or an app on your phone, becomes an interface to a more powerful and capable system, serving you tools and content from a remote location – assuming you are privileged enough to have cheap and fast internet. The advantage is that your data is more mobile – you can access it from different devices and locations and you can share it with others easily.
This is not the only model available. Many activities that rely on the cloud, for example file sharing, can also be handled by our devices directly, with each device talking to the others – this is what is called the peer-to-peer or P2P model. BitTorrent, a file-sharing technology, is one example. Many computer game companies use P2P to distribute their games (such as Diablo III, StarCraft II and World of Warcraft). Other examples are Tor, the anonymisation tool, and Bitcoin, the cryptocurrency, which are probably the best-known peer-to-peer networks today. Users need systems that are decentralised and distributed, that allow interoperability and peer-to-peer communication, and that do not require heavy and expensive central infrastructure.
The fact is that everybody who has a laptop or smartphone these days relies in some way or another on cloud services. Some people do almost everything in the (big) cloud. Providing cloud computing and services is a massive source of income for almost all of the major data-driven companies, including Alphabet, Apple, Amazon, Facebook, Microsoft and Netflix, to name the biggest. But companies like Uber and Zoom also follow the same model. They run the infrastructure and they collect all the data possible. What we get is the interface to that centralised infrastructure, with the benefit of efficient tools. What we lose is all the data that goes into that accumulated and aggregated system. What we also give up, perhaps more importantly for society, is the aggregated knowledge.
Favour data minimalism. In using and choosing cloud-based services, look for services and tools that collect the minimum data possible, that share the minimum data necessary and that store the minimum essential data. If you provide services to others, such as organising events, keeping mailing lists or running surveys, and you don’t have to collect data, don’t do it. If you do, be transparent about it, minimise it as much as possible and delete what you no longer need.
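Data minimalism can even become a habit in code. Here is a minimal sketch of that idea – all field names are hypothetical – where an event sign-up keeps only what it actually needs and discards the rest before anything is stored:

```python
# Data minimalism as code: store only what the service actually needs.
# The field names below are invented for illustration.

REQUIRED_FIELDS = {"email"}  # the only thing this event sign-up truly needs


def minimise(submission: dict) -> dict:
    # Drop everything the form collected beyond the required fields,
    # so surplus data never reaches storage in the first place.
    return {k: v for k, v in submission.items() if k in REQUIRED_FIELDS}


signup = {"email": "a@example.org", "name": "Ada", "phone": "555-0100",
          "employer": "Example Org"}
print(minimise(signup))  # → {'email': 'a@example.org'}
```

The same filter-before-store pattern applies to surveys and mailing lists: decide the required fields up front, and anything else is dropped by default rather than kept by default.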
If you have no choice and you have to rely on Facebook and other popular platforms, Tactical Tech’s Data Detox Kit offers a set of easy steps to help you take better control of your digital privacy, security, and well-being while still using your favourite devices and platforms.
For us, helping users become aware of what’s happening behind the screen of their devices, regardless of what platforms they’re using, is a small but very important step in making informed choices and understanding the politics behind technology and its impact on their lives.
If you are using or recommending Slack, Zoom or other online tools for working, learning or maintaining health services, please read this first: "What You Should Know About Online Tools During the COVID-19 Crisis"
And if you are an activist relying on social media read our Activism on Social Media: Curated Guide
The importance of encryption
Whilst we recommend online or cloud-based tools based on the principles we mentioned above, we also have to evaluate them on things that guarantee the confidentiality and safety of our communication, such as end-to-end encryption. What is end-to-end encryption? It is a technology that enables you to send something to another person, and only you and them can read it. Others who may see this communication only see it as mumbo-jumbo – however, they (usually the service provider) can see the metadata, meaning they can still see that you are sending messages, from where, when, how often and to whom. They just can’t see the content of what you are saying.
Lots of different services use encryption. It enables us to do simple things securely and confidentially online, such as online banking or shopping. Some communication services promise users end-to-end encryption when in fact they only encrypt things between you and their server, and then again between their server and whomever you are talking to. This keeps your communication secure in transit, but the service provider has access to all the information unencrypted. In this sense, your information is only encrypted when it is in motion; it is not technically protected from access, processing, analysis or reuse by the service provider.
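To make the distinction concrete, here is a toy sketch – not real cryptography; real messengers use vetted protocols such as the Signal protocol – of what end-to-end encryption means for a message envelope: whoever routes the envelope can read the metadata, but the content is opaque without a key held only by the two endpoints.

```python
import secrets
from dataclasses import dataclass

# Toy illustration only. Names and the message format are invented.


@dataclass
class Envelope:
    sender: str        # metadata: visible to the service provider
    recipient: str     # metadata: visible to the service provider
    timestamp: str     # metadata: visible to the service provider
    ciphertext: bytes  # content: opaque without the key


def xor_crypt(data: bytes, key: bytes) -> bytes:
    # One-time-pad-style XOR; applying it twice with the same key
    # recovers the original, so it both encrypts and decrypts.
    return bytes(d ^ k for d, k in zip(data, key))


key = secrets.token_bytes(32)  # shared only by the two endpoints
msg = b"meet at noon"
env = Envelope("alice", "bob", "2020-04-01T10:00Z", xor_crypt(msg, key))

# The provider routing `env` still sees who is talking to whom and when:
print(env.sender, env.recipient, env.timestamp)
# Only someone holding the key can recover the content:
assert xor_crypt(env.ciphertext, key) == msg
```

A transport-encrypted service differs in exactly one place: the server, not just the endpoints, holds a key, so it can read `ciphertext` too.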
There are some other limits to end-to-end encryption. Sometimes companies claim they are using it, but it is hard for an average user to verify this. Some services are opt-in and others are only partial. For example, encryption might only work at a certain scale of use, e.g. between two participants but not for group chats. Or it may work well when you use a service one way – for example, messaging on your iPhone – but not for related services, such as backups. This makes it tricky for users to navigate and make good decisions without checking all the details.
If you want to explore what end-to-end encryption is or how it works, please read this accessible and really useful explanation from EFF.
While we are talking about encryption, the other issue with it is that at times of crisis in particular, it becomes a focal point of debate. Whenever we look into online crime, misinformation or fraud, it seems encryption is part of the discussion, with some claiming that it enables these wrongdoings and malicious activities. There is certainly some correlation – but causation is much more complex. If a terrorist decides to use a car as a weapon, for example, that does not make all cars a weapon of terrorism. By the same logic, if encryption is used to plan a criminal activity, it does not mean encryption itself is criminal.
A good example is WhatsApp, which encrypts users' conversations but collects the metadata (as explained above). WhatsApp has over 2 billion users and the fact that their conversations cannot be read, even by WhatsApp, has its strengths. Encryption enables conversations (I may have nothing to hide, but that doesn’t mean I have something to show you, either) and (for now) it protects them from targeted advertising based on what they are talking about (although Facebook is always looking to monetise their user base, so who knows what will change in the future).
For some encryption sceptics, it is the primary enabler of misinformation. Because encryption enables secrecy, some people believe that it provides a secure channel to organise wrongdoing – and that is true: encryption enables confidentiality, security and secrecy. You cannot have one without the others – and it is exclusively in the hands of users. The real challenges are more nuanced. The scale of the encrypted channel is what allows malicious disinformation to be circulated. The actions of its users are what allow banal misinformation to proliferate rapidly and exponentially. Encryption is indeed also used by criminal rings and other offenders, but they will most likely simply pick up another tool if encryption is removed from WhatsApp.
But all that does not mean encryption is bad. As in many cases, tech is neither good nor bad here: technology is stupid but never neutral. It all depends on the intentions of those who use it. It could be that the problem here, again, is the business model. By choosing these tools as our primary communication tools at scale, we support monolithic centralised systems that operate at a planetary scale, causing planetary problems.
There are other aspects to this issue like the cross-platform spread of misinformation (despite encryption) and the responsibility of users. Have a look at our recent post on this topic for a deeper analysis of WhatsApp and misinformation.
In summary, some things to remember:
Let’s turn those principles into some recommendations:
Communication and chatting
For voice calls and messaging:
(both Signal and Wire can be self-hosted on a server, but it’s not simple, since you would have to create dedicated versions of the different client applications)
(an interface to Matrix; here you have to enable encryption per chat room, and key verification is a stretch from the user perspective; you can also self-host it)
As an alternative to Riot, we would recommend looking into Mattermost, which has hosting options or allows for self-deployment.
To read more about why we prefer Signal over WhatsApp, check this post. If you are interested in the use of WhatsApp in the current context, read this interview with Tactical Tech’s Stephanie Hankey.
Voice and video group communication
For voice and video group conversations we recommend using:
Jitsi is not a single service but rather a tool that can be run by anyone anywhere, hence there are many instances that can be used, unlike tools such as Skype or Zoom. Some Jitsi hosts you can use include:
An extensive list of Jitsi Meet instances is available here.
It is worth noting that Jitsi Meet offers end-to-end encryption only for calls between two people (this is the limitation of this kind of in-browser video and audio conferencing with multiple people). You can use it with more people, but you have to trust the host. This is not unique to Jitsi Meet: end-to-end encryption of video calls is limited to a very small number of participants for the majority of tools available; this can be mitigated by using transport encryption and trusting the server in the middle.
From our experience, Jitsi Meet does well with small groups – we would say up to eight to ten people – then things get a bit bumpy. It also demands a lot from participating devices, either stretching their capacity or draining their batteries. That said, we use Jitsi a lot.
If you require an environment resembling a classroom set-up – with a whiteboard, multiple rooms, chat, voice and video, as well as screen, presentation and external video sharing – we would recommend using BigBlueButton. This is especially viable for those who can self-host. It is great for team meetings and webinars, as it also allows external participants. It can handle more participants than Jitsi Meet, and for the user it is an entirely browser-based experience.
If you think Zoom does end-to-end encryption, you are wrong: it offers transport encryption, which means everything you do on Zoom can be decrypted on Zoom’s servers. This is one issue with Zoom (along with a host of other problems – more on that at the end).
We understand that you may need to speak to more than a few people at once, especially if you are trying to run online conferences or courses, but at the moment Zoom has too many issues that it needs to resolve and we are all still seeking viable alternatives. At the very least, think about using different services, or, if you can, break up your work into different pieces. Can you live stream some parts of it? Can you run a webinar, and pre-record some of the content? Can you change the scale of some of the interactive parts of your work to multiple smaller groups or use multiple channels (e.g., one channel for collaborative documents and another for audio chat only, such as Mumble)? Whilst we don’t have the perfect tools available, we may have to be more creative about designing virtual collaborations at scale.
Lastly, for video hosting, we mainly use Vimeo, though it does not meet our criteria (it is not open source and there have been other recent problems with it). We are currently experimenting with self-hosting via PeerTube.
For collaborative tools to share calendars, edit documents and share files we recommend:
As with Jitsi, you can try it on Nextcloud’s servers, but if you want to use it you would have to self-host it or find a provider to host it for you. If self-hosting is not possible, we would instead recommend separate tools for different purposes.
If you are looking for an alternative to WeTransfer or Hightail try Tresorit Send - but bear in mind it is not open source.
Tutanota - a mail service that also offers an encrypted calendar
Working on documents together:
CryptPad (which can also be self-hosted)
If you are thinking of a decentralised alternative to Google docs, forget about it. We would love to recommend one, but there isn’t one, really.
If you are looking for an independent way to manage projects and you are willing to invest in infrastructure and skills, then we would recommend self-hosted GitLab, which works for those with different levels of technical skills.
This is good for large teams and multiple projects where you need multiple assignments, task lists, commenting, issues or basic text editing. It’s great if you can self-host it and you don’t mind learning a bit about a tool that was created specifically for managing code. Or, again, try Nextcloud with plug-ins such as ‘Deck’.
When thinking about your network and what you might need or use for communication and collaboration, watch Julian Oliver’s 2019 lecture, ‘Server Infrastructure for Global Rebellion’, where he explains the process of setting up a robust and reliable environment for Extinction Rebellion.
It is essential, when you are facilitating collaboration and proposing tools for others to use, that you consider certain factors. First, not everyone is equal on the internet. The person you are communicating with might have very limited access to the internet, or it might be very expensive or have limits on the amount of data transferred (video eats lots of data). Second, they might be using outdated or insecure devices, or they might be sharing them with others. Third, their technological, social, economic and political environment might be much less reliable, safe and predictable. The fact that they are communicating with you might expose their vulnerabilities, status, views, beliefs or associations, which might put them at serious risk. Just because you can do something does not mean others should follow. Think carefully about the choices you are making for others when you invite them to participate in your processes.
For more details, email us to request Tactical Tech’s upcoming research report ‘Changing Worlds’ (ttc at tacticaltech dot org).
If you are online, you are being tracked
Even before you switch on your favourite apps, you are being tracked. This is how mobile telephony has to work in order to provide you with services. Mobile telephony works because our phones are constantly pinpointed by mobile providers to secure the best signal, so that we can make calls and get data as we move around. This is enough information to determine our whereabouts fairly precisely, and this data can be turned into behavioural data – for example, where we live, how we commute, who we interact with and what we are interested in.
Tactical Tech made an animation some time ago illustrating how much one could learn from data acquired from a mobile phone.
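As a rough illustration of how location pings become behavioural data, the sketch below – with invented tower IDs and timestamps – guesses someone’s ‘home’ cell from nothing more than which tower their phone pings late at night:

```python
from collections import Counter
from datetime import datetime

# Hypothetical ping log, roughly as a carrier would record it:
# (timestamp, cell tower id). All values are invented.
pings = [
    ("2020-03-30T01:10", "tower-A"), ("2020-03-30T07:45", "tower-B"),
    ("2020-03-30T09:00", "tower-C"), ("2020-03-30T23:30", "tower-A"),
    ("2020-03-31T02:05", "tower-A"), ("2020-03-31T08:10", "tower-B"),
    ("2020-03-31T12:00", "tower-C"), ("2020-03-31T23:55", "tower-A"),
]


def likely_home(pings):
    # Heuristic: the tower seen most often late at night is probably "home".
    night = Counter(
        tower for ts, tower in pings
        if datetime.fromisoformat(ts).hour >= 22
        or datetime.fromisoformat(ts).hour < 6
    )
    return night.most_common(1)[0][0]


print(likely_home(pings))  # → tower-A
```

The same few lines, pointed at daytime hours, would yield a likely workplace; pointed at weekends, likely social ties. No app permissions are involved – this is data the network generates simply by working.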
Why is it important now? We see that in many places telecommunications companies are using and sharing mobile data with governments and public institutions to track, monitor and isolate the spread of the coronavirus, but also to see how much of the population is staying at home and to ensure they stay there. Daily news stories about this are emerging across the globe, such as in Austria and Germany. In other places, governments are partnering with intelligence companies, such as Palantir in the UK, or using tools produced by companies like NSO, known for supporting intelligence gathering and the hacking of citizens.
The Cambridge Analytica scandal exposed the use of targeted advertising and psychological profiling – two methods that had already been employed for years, but known only by those making money from them. The use of big (geolocation) data for tracking and containing the spread of coronavirus will hopefully expose something that, again, has always been there: mobile phones are trackers. This data gets even richer and more precise if your phone is a bit smarter and has network location and GPS enabled.
In the context of sharing big data, we often hear another magical word: 'anonymisation'. Anonymisation is a process that detaches data from the identity of the user. This means that you may be able to see my data – for example, what I like to eat or how often I take a taxi – but you do not know who I am. Unfortunately, anonymisation is very similar to the concept of 'security'. Both are aspirations rather than achievable permanent states. Security is an ongoing process that needs to be maintained. Anonymisation is a process that gives users, and the companies who collect their data, some protection. However, the process can often be undone if you have enough patience and enough data sets at hand. There are many examples of data sets being released into the public by services and then being de-anonymised, re-identifying individuals and often revealing compromising information about them (see here).
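A minimal sketch of such a linkage attack, with entirely made-up data: an 'anonymised' dataset that keeps quasi-identifiers like postcode and birth year can often be re-identified simply by joining it against another dataset that links those same attributes to names:

```python
# Hypothetical "anonymised" records: names stripped, but quasi-identifiers
# (postcode, birth year) kept so the data stays useful for analysis.
anonymised = [
    {"postcode": "10115", "birth_year": 1985, "taxi_rides_per_week": 9},
    {"postcode": "10117", "birth_year": 1972, "taxi_rides_per_week": 1},
]

# Hypothetical auxiliary dataset (e.g. a public register or scraped
# profiles) that ties the same quasi-identifiers to real names.
public = [
    {"name": "A. Example", "postcode": "10115", "birth_year": 1985},
    {"name": "B. Example", "postcode": "10117", "birth_year": 1972},
]

def reidentify(anonymised, public):
    """Join the two datasets on (postcode, birth_year) to recover
    who each 'anonymous' record belongs to."""
    index = {(p["postcode"], p["birth_year"]): p["name"] for p in public}
    return {
        index[(r["postcode"], r["birth_year"])]: r["taxi_rides_per_week"]
        for r in anonymised
        if (r["postcode"], r["birth_year"]) in index
    }

print(reidentify(anonymised, public))
# -> {'A. Example': 9, 'B. Example': 1}
```

With real data sets, a few quasi-identifiers (such as postcode, birth date and gender) are frequently enough to make a person unique, which is why 'anonymised' releases keep being undone.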
Tactical Tech made a short animation about how anonymisation works.
Collective problems require collective actions
Civil society does not have robust, privacy-respecting tools because hardly anybody invests in such tools – neither investors, nor governments, nor private funders. Paradoxically, these actors are often the first to ask for recommendations and to complain that the tools recommended don’t really satisfy their expectations – and they never will.
Technology is fundamental to individuals, organisations, civil society at large and the public sphere. It is truly tragic that we have to rely on the private sector alone, one whose primary decision-making tool is profit. Currently, the mainstream tools and platforms are predominantly US-based. The main alternatives come from countries like China (e.g. TikTok, now with a global reach) or Russia (Yandex, the regional search engine). Most of these solutions are based on the exploitation of personal data.
Technology costs money – lots of money – not only to create tools but, more importantly, to maintain them. The infrastructure required for them to function is even more expensive. And even more critical are the skills needed by those who create and administer these tools. If we want organisations to rely on durable and reliable solutions, they need access to the know-how and resources either to self-host the services they need or to have alternative options that work at scale. Currently, neither is in place.
It is no longer enough to join the camp of those who don’t use Facebook, avoid Google or even install custom ROMs (operating systems) on their mobiles, or rely exclusively on open source (to be honest, some of us at Tactical Tech do that, too). What is necessary is collective action – not one that tries to go back to the old days before data became a billion-dollar asset, but rather one that goes forward. This is an important time to learn what is wrong with the options we have and what impact these practices have on different sectors (like education), on communities who need them and on society at large. It is an excellent time to re-evaluate the tools we need and the reasons we need them, and to be innovative about creating viable alternatives that work in the public interest.
Fighting a pandemic requires collaboration, solidarity and networking. It requires good science and excellent decision making. All these responses need to be supported by technology – or exploited by it. It is time to decide which side you are on.
Most of this text focuses on individual and small-group needs and the technologies they can use in a time of heightened remote working. It is important to mention that the problems we have outlined are even bigger when it comes to larger institutions and organisations: foundations, universities, schools, hospitals, aid agencies, formal and informal networks. All of these larger entities also have to move overnight to virtual modes of operation. They are now making crucial decisions about platforms, providers, services, tools and apps without proper time for training in how to use them, and often without extra funding. They have to look at the resources they have, their obligations, expectations and demands. It is extremely difficult to operate under such conditions. And again, efficiency matters, often more than other things. Institutionally – as much as personally – it is hard to imagine a world in which, for example, Microsoft, Google and Zoom are not facilitating schools. These services definitely make it more efficient, cheaper and easier for schools to become virtual, but they bring with them all the same problems we have been discussing about large-scale platforms and their trade-offs. Imagining alternative technologies requires imagining alternative techno-politics (such as who gathers data, how, and who benefits from it) and alternative business models – what would happen if tax contributions were used to create public-good technologies?
If you want to read more about the problems with technofixes, try this essay, Efficiency And Madness.
We are not doing remote work or remote education as if everything is fine. We’re trying to work, study and carry on with important things in the midst of an enormous health crisis. Nobody knows how long this will last, but we do know it will expand to other aspects of our lives and it will change the way we do things in the future – domestically, professionally, socially, economically, politically, and so on. The decisions made by institutions, organisations, governments and individuals now will define how we work in the future.
We have not yet seen tools developed with a Silicon Valley mindset that are fit for this kind of purpose. Let’s not allow the immediacy of coronavirus to blind us to the long-term consequences by locking us into technology that doesn’t work for the needs of civil society.
The sky is clear and birds are louder than ever
Let's also use this time to reflect and rethink. This abrupt slow-down, lock-down and separation will likely go on for a long time in different places. We don’t yet know how bad it will become. In some cases, it will enable authoritarian regimes and aligned businesses to grab more power. However, it could also be a moment of clarity, recognition, solidarity and collaboration, so that when we emerge from this moment there will be plenty to do. Maybe now we have a good reason to make the time, invest the money and find the support to do it.
Technology is stupid, your smart phone is stupid too, artificial intelligence is stupid as well. The question is: how smart are we?
written at the end of March 2020, Berlin
For further reading and resources start here:
As the Holistic Security guide states: “Security is a deeply personal, subjective and gendered concept. When we work to bring about positive social change, we can face persistent threats and attacks which impact upon our physical and psychological integrity, and often affect our friends and families.” However, taking an organised approach to security can help us to sustain ourselves and our work; you can find more here and amongst the broader holistic security community.
Gender and Technology
Online collaboration, interaction and communication are particularly risky for women, minority groups and people exercising their fundamental freedoms in restricted, non-democratic environments. Through a series of camps and collaborations, Tactical Tech previously co-created a series of tactics, methods and how-tos, which is now maintained by a broader community.
Here you can find a wiki and training curricula where you can learn things like:
- Autonomous and Ethical Web Hosting Services
- Choosing a Service Provider
- Feminist Communication Strategy
- Hacking Hate Speech
- How the Internet works
... and over 20 other topics
In case you are developing tech and made it to this point in the text...
If you are a tech developer, system designer, engineer or somebody thinking of a start-up, running a hackathon or putting your skills to the test, please read the Critical Engineering Manifesto – it is as relevant as ever – and good luck with coding your way out.
Also see how others do it:
For example, the Guardian Project
Or see what projects are supported by the Open Technology Fund – if you don’t mind that they are a US Congress-funded donor (we don’t mind, as the tools they support are open source, localised and audited, among other things). Sadly, they are an outlier: nobody else funds so many important open source projects, localisation work and software audits.
More information and research
We highly recommend The Syllabus
Privacy, Security, Safety, Community Guides and Resources
Totem - Digital Security Training for Journalists and Activists
Some basic software alternatives
Place to start - Alternative App Centre at the Data Detox Kit
If you are an Android user, try F-Droid instead of Google Play
Explore the Free Software Foundation's recommendations if you want to go full Free Software
Support for alternative free/open source tech
The Open Source Observatory and Repository (OSOR)
Self hosting basics
Guide: What the heck is self-hosting? (General Intro)
Awesome-Selfhosted - extensive list of software (free and open source)
Infrastructure provider’s features cheat-sheet (pdf)
Autonomous and Ethical Hosting Providers Sample list (pdf)
Following the case of ZOOM
If you are still using or promoting the use of Zoom, please read these articles:
Acknowledgements: thanks for comments, reviews, suggestions and edits - Alexander Ockenden, Christy Lange, Danja Vasiliev, Jacopo Anderlini, Laura Ranca, Manuel Beltrán, Wael Eskandar and in particular big thanks go to Stephanie Hankey
this text is published under Attribution-ShareAlike 4.0 International (CC BY-SA 4.0)
all pictures by the author