User:Novem Linguae/Essays/Docker tutorial for Windows (WSL)

My notes on how to get a local development environment up and running for MediaWiki development. Written from a Windows and VS Code perspective.

Local development environments are essential for the patch-writing process: they let you instantly test your changes before submitting a patch. They are also essential for debugging, since they let you step-debug an issue if it's reproducible.

🔎TODO: I'm currently running things in like 4 different environments (PowerShell, Ubuntu, MediaWiki Docker, Fresh) and switching between the consoles. Should probably get absolutely everything running in 1 container such as Fresh, then rewrite these directions. Would simplify things.

Things to do at the start of every session

 * Fire up your 2 VS Code windows (1 for MediaWiki Core, 1 for the extension you're working on)
 * Activate XDebug for MediaWiki Core
 * GitHub change to Gerrit.ps1

Some pessimistic advice
Expect to spend more time setting up your dev environment than you do coding, until you've got it set up perfectly on all your computers and have mastered the ins and outs of this work instruction. It can take months to become fluent. MediaWiki has a complicated toolchain.

Windows Subsystem for Linux (WSL)
This is the Windows Subsystem for Linux (WSL) version of this work instruction. The no-WSL version is located at User:Novem Linguae/Essays/Docker tutorial for Windows (no-WSL).

Why use WSL?


 * Advantages
 * Speed. Without WSL, some pages can take 25 seconds to load (barely usable). With WSL, that can drop to about 3 seconds (normal, much better).
 * Disadvantages
 * Can't keep files in Dropbox anymore.
 * More complicated to set up.

Docker
Docker is a fancy XAMPP. It lets whatever codebase you're working on pick what OS, what version of PHP/Python/Node, what database, etc. to use instead of depending on whatever version of XAMPP you happened to install. Then it automates the installation of everything for you.

If you try to use PHP 8.1 with a repo that has a bunch of PHP 7.4 dependencies, for example, you may not be able to get a dev environment up and running, even if you do  instead of . You'll get a bunch of errors. You'd be forced to uninstall XAMPP 8.1 and install XAMPP 7.4, which is a pain. And maybe you need XAMPP 8.1 for another project, so you'd have to do it all over again when switching projects. Docker automates all of this.

Install WSL

 * In PowerShell...
 * When prompted, enter a username such as
 * When prompted, enter a password
 * When prompted, retype your password
 * install Docker Desktop for Windows
 * Docker -> Settings -> General -> tick "Use the WSL 2 based engine"
 * Docker -> Settings -> Resources -> WSL Integration -> tick "Ubuntu"

Install useful software (composer, git, etc.)

 * Update the operating system
 * Make sure Docker is not running. Otherwise it will have trouble in the next step, which tries to modify mysql while it is running.
 * Install common dev programs that aren't already installed such as git-review and composer
 * Configure git and git-review
 * Configure npm (for running unit tests and downloading JS packages). If you don't install it now,  will run the Windows version instead of the Ubuntu version and corrupt a bunch of stuff.
 * restart bash
 * - installs Node version 18, which is what is currently used by Wikimedia
 * In general, anytime you touch anything in , you'll want to use  , to avoid some nasty situations that can arise from using the wrong PHP version. Your Ubuntu's PHP version may not be the same as the PHP version running in the Docker container.
 * Most console commands from now on in the rest of the tutorial will be done from within WSL (type  in PowerShell to access a Ubuntu shell) unless otherwise noted.
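One way to sanity-check that you're getting Ubuntu's npm rather than the Windows one (a sketch; it relies on the fact that WSL mounts Windows drives under /mnt/c):

```shell
# Paths under /mnt/c are Windows binaries mounted into WSL. Running the
# Windows npm from inside Ubuntu is what corrupts node_modules.
is_windows_path() {
  case "$1" in
    /mnt/c/*) return 0 ;;
    *)        return 1 ;;
  esac
}

# Check where npm resolves from, if it is installed at all.
npm_path="$(command -v npm || true)"
if [ -n "$npm_path" ] && is_windows_path "$npm_path"; then
  echo "WARNING: PATH resolves npm to the Windows copy at $npm_path"
fi
```

If you see the warning, install npm inside Ubuntu (via nvm, as described in this section) so the Linux copy wins on your PATH.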

Eliminate password prompts

 * Get git and git review to stop asking you for your password until you close the window:
 * Add to ~/.profile:
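A sketch of the kind of snippet that works in ~/.profile: start one ssh-agent per login session and load your key, so git and git-review stop prompting until you close the window. (The key path ~/.ssh/id_rsa is an assumption; adjust if yours is named differently.)

```shell
# Start an ssh-agent for this session if one isn't already available,
# then load the private key so its passphrase is only asked for once.
if [ -z "$SSH_AUTH_SOCK" ] && command -v ssh-agent >/dev/null 2>&1; then
  eval "$(ssh-agent -s)" >/dev/null
  ssh-add ~/.ssh/id_rsa 2>/dev/null || true
fi
```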

Automatically

 * Place this file in your WSL home directory (one level up from the mediawiki folder): https://github.com/NovemLinguae/WikipediaMiscellaneous/blob/master/bash/mediawiki-from-scratch.sh

Consider wiping out your localhost and installing fresh via this script once a week, and/or when you get unexpected exceptions. Unexpected exceptions are often from alpha versions getting out of sync. For example, maybe you  MediaWiki core to be this week's version, but you forget to   skins/Vector, leaving you on last week's skins/Vector, which is incompatible.
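The "update everything in lockstep" idea can be sketched as a small shell helper. The ~/mediawiki layout and the --ff-only flag are assumptions; after pulling, you'd still run composer and the database updater inside the container.

```shell
# Pull core plus every skin and extension, so nothing is a week behind.
MW_DIR="${MW_DIR:-$HOME/mediawiki}"

update_repo() {
  # Only pull if the directory is actually a git checkout.
  if [ -d "$1/.git" ]; then
    git -C "$1" pull --ff-only
  fi
}

update_repo "$MW_DIR"
for d in "$MW_DIR"/skins/* "$MW_DIR"/extensions/*; do
  if [ -d "$d" ]; then update_repo "$d"; fi
done
# Then, inside the container: composer update, and the database updater.
```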

Install MediaWiki core (1)

 * Set up your SSH keys in Ubuntu. You can generate new ones, or copy them over from Windows.
 * If you copy them over from Windows, they need to go from the C:\Users\NovemLinguae\.ssh\ directory to the /home/novemlinguae/.ssh/ directory.
 * You also need to make sure to set the private key file's permissions to 0600.
 * - replace "novemlinguae" with your Gerrit username
 * create a .env file. This is similar to the .env file provided at https://github.com/wikimedia/mediawiki/blob/master/DEVELOPERS.md, with a couple of tweaks to make XDebug work, set the correct UID/GID for Windows, make PHPUnit throw fewer notices, etc.
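For reference, the baseline .env from DEVELOPERS.md looks roughly like this. The UID/GID lines are the Windows/WSL tweak mentioned above; 1000 is a guess at your Ubuntu user's IDs, so check with `id -u` and `id -g`.

```shell
MW_SCRIPT_PATH=/w
MW_SERVER=http://localhost:8080
MW_DOCKER_PORT=8080
MEDIAWIKI_USER=Admin
MEDIAWIKI_PASSWORD=dockerpass
XDEBUG_CONFIG=
XDEBUG_ENABLE=true
XHPROF_ENABLE=true
# WSL tweak: match the container user to your Ubuntu user
MW_DOCKER_UID=1000
MW_DOCKER_GID=1000
```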

Install MediaWiki core (2)

 * follow the official instructions at https://github.com/wikimedia/mediawiki/blob/master/DEVELOPERS.md
 * - does initial configuration and database creation. assumes sqlite. if you already have a LocalSettings.php file and want to install mariadb, see below.
 * VERY IMPORTANT FOR WINDOWS USERS:

Automatically

 * Place this file in your WSL home directory (one level up from the mediawiki folder): https://github.com/NovemLinguae/WikipediaMiscellaneous/blob/master/bash/install-extension.sh

Manually

 * or
 * foreach (skin/extension):
 * - replace "novemlinguae" with your Gerrit username, and replace "PageTriage" with the extension name
 * (or whatever the name is)
 * add ,  ,  or similar to LocalSettings.php
 * create .vscode/settings.json (and populate it with the text in the section below)
 * - does database updates for skins and extensions
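The clone URL pattern can be sketched as a tiny helper; the username and extension name below are placeholders, substitute your own.

```shell
# Build the Gerrit clone URL for a MediaWiki extension.
# $1 = your Gerrit username, $2 = the extension name.
gerrit_ext_url() {
  echo "ssh://$1@gerrit.wikimedia.org:29418/mediawiki/extensions/$2"
}

# Example usage (run from mediawiki/extensions/):
#   git clone "$(gerrit_ext_url novemlinguae PageTriage)"
```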

Adiutor

 * install Echo
 * install BetaFeatures
 * - this will create 7 .json pages onwiki. Check Special:RecentChanges to see them.
 * Special:Preferences -> beta features -> tick "Adiutor"
 * Special:Preferences -> moderation -> tick all
 * Special:AdiutorSettings
 * When I tried this, I was getting a blank page, with a JS error in the console. Did I not update a dependency? Try again someday.

CentralAuth

 * install mw:Extension:AntiSpoof. mandatory dependency
 * test here if you want, to make sure AntiSpoof is working: http://localhost:8080/
 * add to config:
 * install mw:Extension:CentralAuth, skipping the  step.
 * log into HeidiSQL as root
 * create database named centralauth
 * Tools -> User manager -> my_user -> Add object -> centralauth
 * Tick the check box, granting access to all
 * Save
 * take a backup of the Extension:AntiSpoof table (counter-intuitively named ). Then upload that table to the new centralauth database
 * - use mysql for mariadb, sqlite for sqlite
 * Probably need to do a bunch of configuration, as detailed at mw:Extension:CentralAuth. I'm going to skip that, since all I need at the moment is for Special:GlobalGroupPermissions to work.

DiscussionTools

 * install dependencies
 * Linter
 * Echo
 * VisualEditor
 * install as normal
 * LocalSettings.php

FlaggedRevs
FlaggedRevs is packed with features, so it is important to get its settings right so that it behaves the way you expect. It is basically two extensions in one, with two major modes: override and protection.


 * override - requires all pages to go through review before the revision is displayed to logged-out users
 * protection - requires protected pages to go through review before the revision is displayed to logged-out users

ruwiki (override = false, protection = false)
Note: You also need to add yourself to the "editor" group to review pages.
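A sketch of what the ruwiki-style setup looks like in LocalSettings.php. The variable names are from mw:Extension:FlaggedRevs; double-check the current defaults there before relying on this.

```php
// ruwiki-style setup -- both major modes off:
$wgFlaggedRevsOverride = false;
$wgFlaggedRevsProtection = false;
```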

ORES

 * Add this to LocalSettings.php:
 * ORES fails to fetch data for some of the earlier revisions in the enwiki database. To circumvent this, you can use the Selenium tests to create a bunch of articles and revisions in the DB.
 * NOTE: Some MediaWiki documentation will mention the  variable that controls how many jobs get executed. By default this variable is set to 0. Setting it is not necessary, however, since MediaWiki-Docker provides a separate container that periodically runs accumulated jobs.

PageTriage
Config settings:

ProofreadPage

 * ProofreadPage is the extension that provides much of the backend functionality for the Wikisource projects.
 * A guide exists at soda/mediawiki-scripts/ProofreadPage.md on Wikimedia Gitlab

Scribunto (Modules, Lua)

 * The documentation says there are extra steps, but it works out of the box for me

SyntaxHighlight

 * Careful when git cloning. The extension is actually named

VisualEditor

 * install as normal
 * - this git clones the lib/ve repo into a subdirectory, so that visual editor works on your localhost wiki. do not edit these files though. see below.
 * there are two repos:
 * mediawiki/extensions/VisualEditor
 * VisualEditor/VisualEditor - the contents of the lib/ve folder. patches for this repo need to be done completely separately.  it into its own folder completely outside of /mediawiki/ when you work on it and submit patches for it.

Wikibase (Wikidata)

 * Wikibase Repository and Wikibase Client have separate pages on MediaWiki wiki, but they are both located in a repo named Wikibase.
 * The repo is divided into a couple different sub-repos, contained in folders in the main repo
 * client
 * lib
 * repo

First time

 * - This opens VS Code inside WSL
 * Go to your list of extensions. Filter by installed. They are installed in Windows but not WSL yet. You'll need to click a blue button ("Install in WSL: Ubuntu") to reinstall most of them.

Window #1 - Open the mediawiki folder in VS Code

 * - This opens VS Code inside WSL
 * In the future, this will show up in File -> Open Recent, so you can quickly open it.

Window #2 - Open the extension folder in VS Code

 * If you're working on a MediaWiki extension or skin, open two windows: one for MediaWiki core, and one for the extension you're working on.
 * Run your step debugger in the MediaWiki core window (including setting breakpoints)
 * Do your coding work in the extension window. This will give you "search within repo", git, etc.
 * - This opens VS Code inside WSL
 * Add this to your extension, in a file called, so that MediaWiki core's libraries get imported and detected by PHP IntelliSense:
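A hypothetical example of that .vscode/settings.json, assuming the extension lives at mediawiki/extensions/YourExtension (so core is two levels up) and you use the Intelephense extension:

```json
{
    "intelephense.environment.includePaths": [
        "../../"
    ]
}
```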

Linters

 * Sniffer Mode - onType
 * JavaScript linting - I use the VS Code extension "ESLint". It works out of the box.
 * PHP linting
 * PHP Sniffer - for detecting sniffs
 * settings -> tick "auto detect"
 * phpcbf - for fixing sniffs
 * to use it, right click -> format document

Other extensions

 * Git Blame
 * GitHub Pull Requests
 * PHP Debug
 * PHP Intelephense
 * Sort lines
 * WSL

PHP step debugging: XDebug

 * Always run XDebug from the /mediawiki/ directory, not from an extension directory. According to the documentation, this is mandatory.
 * Make sure VS Code has WSL and PHP Debug extensions installed.
 * Add this to your .env file:
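As a sketch (Xdebug 3 settings; host.docker.internal and port 9003 are the usual values for Docker Desktop on Windows, but verify against the MediaWiki-Docker Xdebug recipe, since I'm not certain these match the exact tweaks referenced above):

```ini
XDEBUG_ENABLE=true
XDEBUG_MODE=debug
XDEBUG_CONFIG=start_with_request=yes client_host=host.docker.internal client_port=9003
```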


 * If you like XDebug's feature of providing big orange errors/warnings/stack traces, include  in your.
 * Replace your launch.json with the below. The  and   parts are very important for getting XDebug to work inside WSL.
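A sketch of such a launch.json for the VS Code PHP Debug extension. The /var/www/html/w path is where MediaWiki-Docker serves core from inside the container; the hostname and pathMappings values here are my assumptions about the "very important" parts, so verify against your own working config.

```json
{
    "version": "0.2.0",
    "configurations": [
        {
            "name": "Listen for Xdebug",
            "type": "php",
            "request": "launch",
            "port": 9003,
            "hostname": "0.0.0.0",
            "pathMappings": {
                "/var/www/html/w": "${workspaceFolder}"
            }
        }
    ]
}
```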

JavaScript step debugging: Google Chrome devtools

 * TODO: see if I can get this working in VS Code instead
 * If you're having trouble setting a breakpoint (for example, the code you need is minified by ResourceLoader), add  to your code.
 * if you're having trouble with minification or caching (15 minutes), add ?debug=1 to the URL

Vue debugging: Vue devtools (browser extension)

 * https://devtools.vuejs.org/guide/installation.html

Running tests
How to run an extension's tests:

PHPUnit

 * First time:
 * Add this to your .env file
 * to get PHPUnit to stop outputting detailed debugging (recommended; otherwise your unit test output is really noisy):
 * to get PHPUnit to use your actual database instead of a TEMPORARY database, so that you can peek at the tables when you step debug:
 * Core
 * - all
 * Folder/type
 * - tests in the /unit/ subfolder only
 * - tests in the /integration/ subfolder only
 * Extensions and skins
 * - an extension's tests only
 * Specific file
 * - a specific test file only
 * Specific test
 * @group
 * 🔎(todo)
 * Debugging CI
 * https://gerrit.wikimedia.org/r/c/mediawiki/extensions/PageTriage/+/997634

Jest

 * First time - install nvm (node version manager) so you can switch to the correct version of node used by Wikimedia
 * restart bash
 * - installs Node version 18, which is what is currently used by Wikimedia
 * - does linting too
 * - does tests (for this extension only) and code coverage
 * - shows console.log output, in case you want to spy on a variable
 * - run a single test file
 * - if a code coverage report is on by default in your repo, this silences it
 * - this will regenerate snapshots for your snapshot tests
 * non-MediaWiki repos: generate HTML coverage reports using

QUnit

 * Note that QUnit tests will run in CI even if they are not set up at  or in package.json. One of the CI test entry points is Special:JavaScriptTest. In fact,   is only supposed to contain linters, not tests.
 * how to run the tests
 * in bash
 * - standard way to do it according to mw:Manual:JavaScript unit testing
 * - jquery-client repo. uses  to load karma, which loads QUnit
 * via web interface
 * add  to LocalSettings.php
 * then visit Special:JavaScriptTest

Selenium

 * install Fresh
 * check your version of node:
 * download the file corresponding to your version of node from here
 * rename the file to end in .sh
 * move the file into WSL, at a directory such as
 * within a WSL console, . Fresh is basically a fancy Docker container for Selenium. Running this will get Docker running.
 * 🔎https://www.mediawiki.org/wiki/Selenium/Getting_Started/Run_tests_using_Fresh
 * 🔎or maybe just do the normal ?
 * how to find and fix flaky Selenium tests: https://gerrit.wikimedia.org/r/c/mediawiki/extensions/PageTriage/+/993696

Parser tests

 * All
 * Specific extension

Code coverage
How to generate code coverage reports:
 * PHPUnit
 * In your .env file, XDEBUG_MODE must include "coverage". Example: . Restart your mediawiki docker after changing this.
 * Open the file mediawiki/tests/phpunit/suite.xml. Replace the section with something similar to the following. You need to specify every extension file and directory you want checked, and you need to delete all the mediawiki directory folders.
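A hypothetical example of the replacement, using PHPUnit's <coverage>/<include> elements; the PageTriage paths are placeholders for whatever extension you're measuring.

```xml
<!-- List every extension path you want measured, and delete the
     core directories that are listed by default. -->
<coverage>
    <include>
        <directory suffix=".php">../../extensions/PageTriage/includes</directory>
    </include>
</coverage>
```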

PHP

 * First time:
 * For core, this will take around 10 minutes.
 * The configuration file is located at maintenance/Doxyfile in core, Doxyfile (root directory) everywhere else
 * mw:Manual:Mwdocgen.php
 * mw:Manual:Coding conventions/PHP
 * Requires extra steps to get it published to doc.wikimedia.org. Including getting the permission of the maintainers, adding it to integration/config, and adding it to the homepage of doc.wikimedia.org.

JavaScript

 * - Generates documentation in the /docs/js/ folder. Navigate to /docs/js/index.html to view.
 * The configuration file is located at jsdoc.json or .jsdoc.json
 * JSDoc
 * Requires extra steps to get it published to doc.wikimedia.org. Including getting the permission of the maintainers, adding it to integration/config, and adding it to the homepage of doc.wikimedia.org.
 * JS documentation does not ride the train. It publishes instantly. doc.wikimedia.org has a 1 hour server-side cache so it may be 1 hour before changes show up.

Running maintenance scripts

 * core
 * will run maintenance/showSiteStats.php
 * extension
 * will run extensions/Adiutor/maintenance/updateConfiguration.php
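The two examples above can be sketched as a small wrapper so maintenance scripts always run with the container's PHP. It assumes the standard MediaWiki-Docker service name "mediawiki" and the maintenance/run.php entry point (MediaWiki 1.40+); adjust if your setup differs.

```shell
# Run a maintenance script inside the MediaWiki Docker container.
mw_maint() {
  docker compose exec mediawiki php maintenance/run.php "$@"
}

# Core script:       mw_maint showSiteStats
# Extension script:  mw_maint extensions/Adiutor/maintenance/updateConfiguration.php
```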

SQL database

 * how to install the database if you already have a LocalSettings.php file with correct database connection info, and a created database
 * harder than it should be. I've created a ticket. But in the meantime...
 * 🔎go into HeidiSQL, delete all the tables
 * rename your LocalSettings.php file to something else
 * re-run, with all the correct CLI parameters
 * delete LocalSettings.php
 * rename your old LocalSettings.php back to LocalSettings.php
 * how to update the database (installs SQL tables for extensions)
 * how to drop all tables on a MariaDB
 * 🔎install HeidiSQL
 * drop the tables

SQLite or MariaDB?

 * SQLite is the default. Pros and cons:
 * Pro - Keep your localhost database synced between computers, e.g. desktop and laptop, because the database is stored in the docker container in the /cache/ directory.
 * Pro - Easily clear the database by simply deleting the /cache/ directory.
 * Pro - Easy to set up a database viewer and editor, since you just need to point it to /cache/sqlite/my_wiki.sqlite
 * Con - Different than Wikimedia production, which uses MariaDB
 * Con - Subtle bugs, such as using raw SQL instead of $this->db->expr(), can break tests and break the extension in general. In Gerrit,  can be used to test for some of this.
 * Con - JSherman (WMF) says he's had problems with DeferredUpdate and job queue behavior in SQLite.
 * MariaDB is an alternative, and removes an entire class of possible bugs since it is much closer to how Wikimedia production is set up. How to set it up:
 * MediaWiki-Docker/Configuration recipes/Alternative databases
 * follow this exactly. don't forget both the docker-compose.override.yml step and the maintenance/install.php step (copy pasting their custom string)
 * re-run

Viewing and modifying the database: HeidiSQL

 * to view/edit the SQL database, install HeidiSQL (download page)
 * sqlite
 * 🔎point HeidiSQL at mediawiki/cache/sqlite
 * mariadb
 * make sure your docker-compose.override.yml file has the following:
 * configure HeidiSQL with the settings in docker-compose.override.yml
 * root
 * hostname = localhost
 * username = root
 * password = root_password
 * or a specific database
 * hostname = localhost
 * username = my_username
 * password = my_password
 * database = my_database
 * I couldn't figure out how to shell into the database, so use HeidiSQL logged in as root for creating databases, editing users, etc.

LocalSettings.php

 * To get uploading working...
 * LocalSettings.php:
 * bash:
 * Then visit Special:Upload
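A minimal sketch of the usual two steps, assuming the stock images/ directory (see mw:Manual:Configuring file uploads for anything beyond this):

```php
// LocalSettings.php -- turn uploads on:
$wgEnableUploads = true;
// Then, in bash, make images/ writable by the web server user, e.g.:
//   chmod -R o+rwx images/
```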

Miscellaneous

 * File sizes
 * MediaWiki + skin + extension files is around 1.1 GB
 * Docker files are around ?? GB
 * how to remote into Docker so that you don't have to add  to the start of every command, and so that you can   around more easily
 * how to run an extension's maintenance script
 * restarts
 * any changes to the .env file require a restart of the Docker container:

Troubleshooting

 * PHP errors when loading the wiki in a browser, after taking a break for a couple of weeks and then doing  on core or one extension
 * Update core, all extensions, and all skins with ,  , and .
 * Comment out the extensions and skins you're not using in LocalSettings.php, so you have fewer extensions and skins to update.
 * Don't forget to update Vector. This is often forgotten and is often the source of the problem.
 * Container mediawiki-mariadb-1: Error response from daemon: Ports are not available: exposing port TCP 0.0.0.0:3306 -> 0.0.0.0:0: listen tcp 0.0.0.0:3306: bind: Only one usage of each socket address (protocol/network address/port) is normally permitted.
 * Are you also running XAMPP? Close XAMPP, then go into Task Manager and terminate mysqld.exe.
 * error during connect: This error may indicate that the docker daemon is not running.: Get " http://%2F%2F.%2Fpipe%2Fdocker_engine/v1.24/containers/json?all=1&filters=%7B%22label%22%3A%7B%22com.docker.compose.project%3Dmediawiki%22%3Atrue%7D%7D&limit=0 ": open //./pipe/docker_engine: The system cannot find the file specified.
 * Start Docker Desktop, then try your CLI command again.
 * There's a bunch of files in WSL that end in .dropbox.attrs
 * Delete them with ,
 * fatal: fsync error on '//wsl.localhost/Ubuntu/home/novemlinguae/mediawiki/extensions/AbuseFilter/.git/objects/pack/tmp_idx_83ZVF3': Bad file descriptor. fatal: fetch-pack: invalid index-pack output
 * Did you  in PowerShell instead of WSL by accident? Need to   from within WSL.
 * npm ERR! code EUSAGE. The `npm ci` command can only install with an existing package-lock.json or npm-shrinkwrap.json with lockfileVersion >= 1. Run an install with npm@5 or later to generate a package-lock.json file, then try again.
 * Did you  in PowerShell instead of WSL by accident? Need to   from within WSL.
 * sh: 1: phpunit: Permission denied
 * cmd.exe was started with the above path as the current directory. unc paths are not supported
 * You're trying to run npm/Jest in Ubuntu, but npm is not installed in Ubuntu, so it is using the Windows version. The fix is to install the Ubuntu version. See the Unit Test -> Jest section above.
 * sh: 1: eslint: Permission denied
 * Your npm packages are corrupted. Did you install them using npm for Windows instead of npm for Ubuntu by accident? The fix is to install the Ubuntu version. See the Unit Test -> Jest section above. Then  to repair your packages.
 * Error: Class "ResourceLoaderSkinModule" not found
 * Update your skins
 * Special:NewPagesFeed / pagetriagelist API query times out
 * Change the filters it is using. The combination of filters you're using is buggy. T356833
 * Notice: Did not find alias for special page 'NewPagesFeed'. Perhaps no aliases are defined for it?
 * Restart your Docker container.
 * gives a "divergent branches" error
 * VS Code: Failed to save 'X': Unable to write file 'Y' (NoPermissions (FileSystemError): Error: EACCES: permission denied, open 'Z')
 * Some of your files are owned by "root" instead of "novemlinguae". Fix with...
 * Windows system gets laggy. Vmmem consumes huge amount of memory in task manager (8-14 GB)
 * WSL has a memory leak
 * When Docker pops up a window that WSL has crashed, click Restart
 * This repo has npm pre-commit installed, and VS Code's "commit" button doesn't work well for some reason. Do  in bash instead. Should work there.
 * After I build the assets, I noticed differences in the contents to what you committed. Try running `npm run build` again or removing the node_modules folder and running npm install with the correct node version.
 * This repo has npm pre-commit installed, and you forgot to run  before committing. Run , then commit again.
 * ssh: Could not resolve hostname gerrit.wikimedia.org: Temporary failure in name resolution
 * Turn off your VPN