Cybersec 101: How to Protect Yourself Online

By Bojan Perkov, 9th April 2021

This post was co-authored by Bojan Perkov and Andrej Petrovski.

Databases, servers, algorithms – these technologies not only underpin so much of our world and make it work, but also exert an increasingly direct influence on our lives.

Who gets a job and who gets fired, who gets state aid, or which child goes to which school – these are just some of the questions we increasingly rely on technology, in its various forms, to answer.

With the emergence of the COVID-19 pandemic, millions of 9-to-5 office employees around the world suddenly became remote workers, on endless video-calls from their living rooms.

If there was a time when our “offline” and “online” selves finally became one, it was no doubt during the pandemic. Ever-extending lockdowns, travel restrictions and social distancing made us turn to glowing-screen devices and online services as our safest chance to experience the world and things we hold dear.

Rising cybercrime

In these circumstances we need to protect our digital identities and assets even more, especially with the increase of cybercrime due to the pandemic.

Using vulnerable home devices with outdated operating systems, software and firmware, and working on public WiFi networks without a VPN, are major risk factors exposing people to various technical attacks.

Ransomware, phishing and smishing may seem like things that happen to other people, but one mistake can prove very costly. If the adversary is a skilled malicious hacker group, a nation state or some other actor with advanced capabilities, the damage for organisations and individuals in terms of stolen or destroyed data can be beyond repair.

Sensitive information

Journalists, human rights defenders (HRDs) and other public interest watchdogs should be particularly wary of technical attacks on their digital infrastructure and assets, as they keep highly sensitive information regarding the identity of sources or victims of abuse.

In a world of global organised crime, high-level corruption schemes and abuses of power, journalists and HRDs rely more and more on information given to them by whistleblowers – people who could suffer serious harm if their identity is leaked. Therefore, public interest watchdogs must be able to protect the identity of their sources, especially in the digital realm. One of the ways this can be achieved is through anonymous document submission platforms such as SecureDrop.

Staying safe

An organisation is only as safe as its most vulnerable member, at every step of the digital process (now often taking place in a domestic environment).

So, let’s examine some of these steps, starting from the beginning: access, or login credentials. A good and strong password is essential – or so goes the mantra everybody has heard countless times – and yet bad passwords are still one of the most common points of entry for malicious actors.

There’s no way we can memorise dozens of long, meaningless sets of characters – which we should also change from time to time – but the worst solution is to let the browser memorise them for us. That is what password managers such as KeePass, KeePassXC or Bitwarden are for: they also enable users to automatically generate unique, long and random passwords that are very hard to crack by guessing, dictionary attacks or brute force.
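Under the hood, password managers typically build such passwords by drawing characters from a large alphabet using a cryptographically secure random source. A minimal sketch of the idea in Python – illustrative only, not how KeePass or Bitwarden are actually implemented:

```python
import secrets
import string

def generate_password(length: int = 20) -> str:
    """Generate a long, random password from letters, digits and symbols,
    using a cryptographically secure source of randomness."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())  # different on every run
```

A 20-character password drawn from a ~94-symbol alphabet has far too many combinations to be found by guessing, dictionary attacks or brute force – which is exactly why letting a manager generate it beats inventing one ourselves.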

Another strongly recommended step is to turn on 2-step verification on every account we use. Also, an increasing share of online traffic already goes through virtual private networks, so why not start using a reliable VPN?
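The six-digit codes that 2-step verification apps display are generally produced with the standard TOTP/HOTP scheme (RFC 6238 / RFC 4226): an HMAC over a time-based counter, keyed with a secret shared between your device and the service. A minimal sketch, for illustration only:

```python
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HOTP: HMAC-SHA1 over an 8-byte counter, dynamically truncated."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, period: int = 30) -> str:
    """RFC 6238 TOTP: HOTP with a counter derived from the current time."""
    return hotp(secret, int(time.time()) // period)

# RFC 4226 test vector: secret "12345678901234567890", counter 0 -> "755224"
print(hotp(b"12345678901234567890", 0))
```

Because the code changes every 30 seconds and depends on a secret an attacker doesn’t have, a stolen password alone is no longer enough to get into the account.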

End-to-end encryption (E2EE) should be used on as many services as possible: email, chat, collaborative documents.

Switching to email providers which offer built-in E2EE, like ProtonMail or Tutanota, may seem too complicated and even expensive, but it offers much more protection compared to “free” services such as Gmail.

As for messaging apps, Signal received a lot of attention at the start of 2021, after the changes to WhatsApp’s data sharing practices were announced and users started ditching the Facebook-owned app for alternatives. Although it requires a valid phone number for registration, Signal provides E2EE chats by default, as well as group audio and video calls for up to 8 members.

When it comes to teamwork, Cryptpad offers an encrypted and open source online collaboration suite (storage drive, documents, spreadsheets, polls, presentations, etc.) which can also be self-hosted to accommodate the specific needs of a collective.

When we need to encrypt files stored on hard drives, VeraCrypt is an excellent choice that supports various platforms. When drives are well encrypted, even if devices get stolen or tampered with, it is virtually impossible to access the files stored on them without the decryption password.
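Disk encryption tools don’t use the password itself as the key: they stretch it through a deliberately slow key-derivation function, so that trying millions of password guesses offline stays expensive. A rough illustration of the idea using PBKDF2 from Python’s standard library (VeraCrypt’s actual scheme uses its own algorithms and parameters):

```python
import hashlib
import os

def derive_key(password: str, salt: bytes, iterations: int = 600_000) -> bytes:
    """Stretch a password into a 256-bit key; the high iteration count
    makes each offline password guess deliberately slow."""
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)

salt = os.urandom(16)  # random salt, stored alongside the encrypted volume
key = derive_key("correct horse battery staple", salt)
print(key.hex())       # 32-byte key, unrecoverable without the password
```

The salt ensures two users with the same password get different keys, and the iteration count is the knob that keeps brute-force attacks impractical even on fast hardware.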

A cybersecurity toolkit

SHARE Foundation has been providing free tech and legal support, as well as digital security training, to online media and civil society organisations since 2014, the same year we started monitoring cases of digital rights violations in Serbia.

During this time, we encountered numerous cases of technical attacks against journalists and activists, ranging from DDoS to malicious code injection attacks.

In 2020, SHARE partnered with Balkan Investigative Reporting Network (BIRN) to expand the monitoring process to 5 additional countries in Southeast Europe – Bosnia and Herzegovina, Croatia, Hungary, North Macedonia and Romania. Currently there are more than 1,500 cases in the umbrella database encompassing all six monitored countries.

Given that the problems with digital security are becoming more common among civil society and media organisations, SHARE Foundation developed an open platform called Cybersecurity Toolkit.

It provides one-stop instructions and possible solutions to problems with websites, applications or devices, but also offers guidance and knowledge of good practices in protecting information systems and all our digital goods. A feature that makes this Toolkit particularly important is that it’s focused on legal and practical counselling for victims of technology-based violence or harassment. 

In times of scarcity and uncertainty, growing pressures and complexities of our daily lives, there are reliable technical solutions that can help us reduce digital risks and focus instead on what’s really important, whatever that may be for each of us. Stay safe. 

Bojan Perkov is a Policy Researcher at the SHARE Foundation. Andrej Petrovski is Director of Tech.

One Year On: COVID-19 and Digital Rights

By Nani Jansen Reventlow, 7th April 2021

Last year, in the early days of the pandemic, we declared the COVID-19 pandemic a crisis for digital rights. Today on World Health Day, one year on, we’re asking: how has the situation evolved? Have our fears materialised, and has the coronavirus changed the digital rights landscape forever?

Last spring, digital rights challenges were spreading as quickly as the virus itself. Worrying trends, from “biosurveillance” measures to tracking apps, sprung up almost immediately, spurring digital rights activists into action.

Back then, fears abounded about how intrusive surveillance measures or intensive data collection would proliferate in the coming months. One year on, not only have such measures become increasingly common: they’ve also become rapidly normalised across the globe. 

Mounting cases

In DFF’s own COVID-19 litigation fund, launched as an emergency support against potential digital rights violations, the issues tackled by grantees ranged from worrying COVID-19 tracking apps to increased surveillance of university students.

In the UK, Big Brother Watch plans to take a claim to the High Court against thermal scanners, which are now deployed in many public places, from schools to shops. The organisation is hoping for acknowledgement that the data garnered from these scanners is personal data, and that impact assessments must take place before these machines are used.

In Germany, on the other hand, Gesellschaft für Freiheitsrechte is pursuing litigation against health insurance providers, which are transferring the pseudonymised health data of millions of people to institutions for research purposes. GFF fears, however, that the security standards for people’s sensitive health data are too weak, and that certain unique data could later be re-identified.

A global problem

The same concerning measures recur again and again across the globe. 

With the worldwide vaccine rollout in full swing, many countries have begun to make use of “immunity passports”. Israel has put in place a “green pass” system allowing only those who are vaccinated or who have already recovered from COVID-19 to visit restaurants and attend events. China has launched a digital vaccination passport for those planning to cross borders, while in the UK, there are proposals to introduce vaccine certificates in order to gain entry to venues such as pubs.

Many airlines are also promising to make use of these passes to prevent travellers from spreading COVID-19, despite the fact that most of the world is unlikely to have access to the jab in 2021.

But these kinds of immunity passports give states huge powers to surveil, and the risk of mission creep – the possibility of the collected data being repurposed in other contexts – is high. 

The concept of such passports is inherently discriminatory, creating as it does yet another system through which to exclude certain individuals. As always, this is likely to include already-marginalised groups who don’t have access to vaccines.

The all-seeing eye

Surveillance, in forms both old and new, has been deployed extensively throughout the pandemic. New ways to track people that might, in previous decades, have been widely acknowledged as eerily Orwellian now have the justification of protecting public health.

Israel, for example, is offering travellers arriving from abroad the option of wearing an electronic bracelet paired with a wall-mounted tracker. The system is alerted when an individual takes off the bracelet or moves too far from the tracker.

Security risks and data insecurity are endemic with tracking apps. In Europe, the French contact tracing app was last year deemed not fully compliant with the GDPR, while in other countries such as India, using a tracking app became de facto mandatory for many.

One year on

It’s a truism that a crisis is often needed for big societal shifts to happen, but in the case of the pandemic, the cliché rings all too true. Digital technology was inserting itself into ever more areas of our lives even before COVID-19, but the last year has seen the pace of change reach light speed, propelled by panic about the spread of the virus.

This normalisation of intrusive or discriminatory technologies should not go unchecked. Even with the pandemic stretching on longer than we might have initially hoped, we must continue to scrutinise all new measures, and ensure they don’t stick around once they’re no longer needed.

Creating Conditions for a Decolonised Digital Rights Field

By Laurence Meyer, 31st March 2021

This post was co-authored by Laurence Meyer and Sarah Chander.

During the DFF strategy meeting 2021, participants organised two sessions on “decolonising digital rights.”

A public panel discussion on “Decolonising Data” was also held on the sidelines of the strategy meeting, made possible largely because the strategy meeting happened online for the first time. Here are some reflections.

Since 2019, DFF and EDRi have been working to initiate a decolonising process for the digital rights field. Reflecting on the increased challenges to our digital rights, we realised how imperative it is that the field truly reflects everyone in European society. This means improving representation in the digital rights field, but more crucially undoing the power structures preventing us from protecting digital rights for everybody.

We discussed with participants of the DFF strategy meeting how to take this forward. In particular, in a session led by Roxanna Lorraine-Witt of Save Space e.V., we took a deep dive into how to take practical steps forward in the decolonising process. 

Making digital rights real 

Thinking back to our first decolonising digital rights gathering in December 2020, we realised how important it is to have a vision for change to work toward. 

One major aspect of this vision was to make digital rights real – for digital rights to be clear, tangible and relevant to everyone in society, not just to a privileged few. Part of this vision was for the digital rights field to be firmly situated in broader social justice fights and to be actively working with other movements. 

Hopefully, this will help us to realise more ambitious goals, like defunding surveillance tech that targets racialised communities and re-directing resources to communities, or ensuring everyone has access to technology on their own terms and in a manner that takes into account the environment we live in. 

Even this process of constructing goals with a wide range of actors – racial and social justice activists, technologists and digital rights organisations – was an attempt to make a small shift in the practice for the field. 

The process of how we might realise this vision of making digital rights real for all – that’s where decolonising comes in. 

The pre-conditions for change

Colonisation as a process and colonialism as a reality are about occupying space, displacing/replacing people and extracting resources. Colonialism is also very much about excluding some people from the benefit of rights even though we all share the same space.

Approaching digital rights through the lens of decoloniality invites us to interrogate how digital space is occupied, the people who are displaced, and the mechanisms of extraction it requires to exist. We should ask these same questions of the digital rights field.

Asking these questions is an essential pre-condition for change. There are no simple answers, both because they demand that we review and interrogate ways of doing and thinking that seem natural, and because the discussions they require engage us emotionally. It requires extra work, extra care, extra patience, extra humility.

During the strategy meeting, participants reflected on some of the barriers to decolonising that need to be questioned and addressed as a pre-condition for change within the field. 

First, participants noted the challenge of articulating and discussing decolonisation of the digital rights field. Due to the magnitude and the essence of the work, finding adequate wording to describe what a decolonised digital rights field looks like can be difficult. Much of the language that will allow us to envision with more precision what is possible is still to be invented.

Many participants also outlined the disconnect between what are considered top priorities in the digital rights field as it stands and the practical issues that marginalised groups face on a daily basis.

For example, for many communities the primary digital rights issue is getting access to digital technology and/or electricity to be able to use it. 

Participants also observed that some digital rights organisations approach systemic injustices from a merely technical perspective, rather than situating these issues within the broader reality of how human rights harms are pervasive online and offline for certain groups and communities. This can lead to digital rights issues being viewed as mere technical issues with technical solutions, rather than harms that interrelate and intersect with other human rights violations. 

The discussion also focused on the difficulties faced by grassroots movements in building constructive relationships with established organisations and institutions, due in part to a lack of a common language and understanding, as well as an imbalance in institutional power relationships.

This can reinforce the exclusion of many grassroots organisations from partnerships and proposals for funding, which could be overcome to some extent by more participatory structures of funding. 

Finally, we discussed the emotional toll that decolonising work can take on members of affected communities. Many referenced the book “Pleasure Activism: The Politics of Feeling Good” by Adrienne Maree Brown as an inspiration for developing resilience tools.

Participants also agreed on the centrality of creating a community to meet collective needs, with structures and processes in place to prevent exhaustion.

While we highlighted many challenges to decolonising the digital rights field, the overwhelming conclusion of the session was a shared determination, enthusiasm and readiness to work toward concrete solutions.

What’s next?

In our next phase of the decolonising work, a collaborative design process that will lead to a plan for a multi-year decolonising process, we will attempt to foster the pre-conditions for change. In the design process, we will aim to build a decolonising community of people with different disciplines, areas of expertise and experience to collectively design a programme of activities to engender change in the field. 

Some of the pre-conditions we will need to establish include: mapping what needs to change, developing a common language and understanding, and putting in place the necessary structures to ensure this work will be sustainable. 

The determination to engage in a process of decolonising the digital rights field and the acknowledgement of the challenges it entails will be key to our success moving forward. It will require a frank acknowledgement of where the field is now, and where it needs to go to see meaningful and genuine shifts. We are excited to embark on the next phase. 

Laurence Meyer is Social and Racial Justice Lead at the Digital Freedom Fund. Sarah Chander is Senior Policy Advisor at EDRi (European Digital Rights).