In the following table you can see what all these settings mean:

| Setting    | Description |
| ---------- | ----------- |
| Primary    | This marks a Geo Node as primary. There can be only one primary, so make sure that you first add the primary node and then all the others. |
| URL        | Your instance's full URL, in the same way it is configured in `gitlab.yml` (source based installations) or `/etc/gitlab/gitlab.rb` (omnibus installations). |
| Public Key | The SSH public key of the user that your GitLab instance runs on (unless changed, this should be the user `git`). That means that you have to go to each Geo Node separately and create an SSH key pair. See the [SSH key creation][ssh-pair] section. |

## Secondary node GitLab setup

>**Note:**
The Geo nodes admin area (**Admin Area > Geo Nodes**) is not used when setting
up the secondary nodes. This is handled at the primary one.

To install a secondary node, you must follow the normal GitLab Enterprise
Edition installation, with some extra requirements:

- You should point your database connection to a [replicated instance](./database.md).
- Your secondary node should be allowed to communicate via HTTP/HTTPS and
  SSH with your primary node (make sure your firewall is not blocking that).
- Don't perform any extra steps you would do for a normal new installation.
- Don't set up any custom authentication (this will be handled by the `primary` node).

Make sure you have restored the database backup (that is part of setting up
replication) and that the primary node's PostgreSQL instance is ready to
replicate data.
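One quick way to confirm the HTTP/HTTPS and SSH requirement above is a port check from the secondary node. This is only a sketch and assumes `nc` (netcat) is installed; replace `<primary-node-url>` with the FQDN of your primary node and adjust the ports if your instance is served elsewhere:

```
# From the secondary node, check that the primary node is reachable on the
# ports used for HTTP, HTTPS and SSH
nc -zv <primary-node-url> 80
nc -zv <primary-node-url> 443
nc -zv <primary-node-url> 22
```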
### Step 2. Updating the `known_hosts` file of the secondary nodes
1. SSH into the **secondary** node and login as root:
```
sudo -i
```
1. The secondary nodes need to know the SSH fingerprint of the primary node that
   will be used for the Git clone/fetch operations. In order to add it to the
   `known_hosts` file, run the following command and type `yes` when asked:
```
sudo -u git -H ssh git@<primary-node-url>
```
   Replace `<primary-node-url>` with the FQDN of the primary node.
1. Verify that the fingerprint was added by checking `known_hosts`:
```
# Omnibus GitLab installations
cat /var/opt/gitlab/.ssh/known_hosts
# Installations from source
cat /home/git/.ssh/known_hosts
```
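If you prefer a non-interactive check, `ssh-keygen -F` looks a host up in a `known_hosts` file. A minimal sketch, assuming the Omnibus path shown above (use the source installation path otherwise):

```
# Prints the matching known_hosts entry if the primary node is present;
# replace <primary-node-url> with the FQDN of the primary node
sudo -u git -H ssh-keygen -F <primary-node-url> -f /var/opt/gitlab/.ssh/known_hosts
```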
### Step 3. Copying the database encryption key
GitLab stores a unique encryption key on disk that is used to safely store
sensitive data in the database. Any secondary node must have the
**exact same value** for `db_key_base` as defined in the primary one.
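For orientation only, the excerpts below sketch roughly where `db_key_base` sits in each secrets file; the value shown is a placeholder and the surrounding layout can differ between GitLab versions:

```
# /etc/gitlab/gitlab-secrets.json (Omnibus), roughly:
#   "gitlab_rails": {
#     "db_key_base": "<long random string>",
#     ...
#   }
#
# /home/git/gitlab/config/secrets.yml (installations from source), roughly:
#   production:
#     db_key_base: <long random string>
```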
1. SSH into the **primary** node and login as root:
```
sudo -i
```
1. Find the value of `db_key_base` and copy it:
```
# Omnibus GitLab installations
cat /etc/gitlab/gitlab-secrets.json
# Installations from source
cat /home/git/gitlab/config/secrets.yml
```
1. SSH into the **secondary** node and login as root:
```
sudo -i
```
1. Open the secrets file and paste the value of `db_key_base` you copied in the
previous step:
```
# Omnibus GitLab installations
editor /etc/gitlab/gitlab-secrets.json
# Installations from source
editor /home/git/gitlab/config/secrets.yml
```
1. Save and close the file.
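To double-check the copy, one option is to compare the `db_key_base` line on both nodes; the output of the following command should be identical on the primary and the secondary (this is only a sanity-check sketch, using the same paths as above):

```
# Omnibus GitLab installations
grep db_key_base /etc/gitlab/gitlab-secrets.json
# Installations from source
grep db_key_base /home/git/gitlab/config/secrets.yml
```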
### Step 4. Enabling the secondary GitLab node
1. SSH into the **secondary** node and login as root:
```
sudo -i
```
1. Create a new SSH key pair for the secondary node. Choose the default location
and leave the passphrase blank by hitting 'Enter' three times: