Soon after starting game development with, say, Unreal Engine, you eventually find out (sometimes in an unpleasant way) that you need to version control your project effectively.
Even after that, you discover that public hosting providers give you limited free storage for your beefy artifacts (textures, animations, 3D assets). We had the same issue at Vaslabs, so we decided to dive deep and remove this obstacle from our development pipeline!
In this blog post, I'll walk you through the steps we took to migrate from GitLab LFS to Amazon S3, which helped us significantly reduce costs and management overhead of the storage needed for our game. The process involved setting up a Terraform project, creating and configuring an S3 bucket, managing user permissions, and using lfs-dal to reroute Git LFS traffic to S3.
The problem
Git providers offer limited free storage space for repositories. However, once you exceed this limit, you must pay to continue using the service. This can become costly, especially for small video game companies or individual developers who need to store numerous assets, animations, and 3D models.
The solution
At Vaslabs we are Linux and FOSS enthusiasts, and we actively support the open-source community where and when we can. So we created a repo ( https://github.com/vaslabs-ltd/git-s3-lfs ) that helps you create the S3 bucket and the users needed to store your assets, files, etc., in AWS S3. That way, you don't have to worry about exceeding limits and having your progress blocked until you purchase more storage, and it costs significantly less. That's why we made this project: to minimize the development effort needed to get your own cheap and flexible LFS storage.
Step 1: Set Up the Terraform Project
The first step in our migration journey was to set up the Terraform project. Terraform ( https://www.terraform.io/ ) is an excellent tool for managing infrastructure as code, and it allowed us to ensure everything was set up correctly before making any changes.
Initialize the Terraform project:
terraform init
Create the Terraform configuration file ( main.tf ) to define the necessary resources, including the S3 bucket and IAM users. A rough sketch of what this looks like follows.
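To give a rough idea of what that file contains — the real resources live in the repo linked above, so the resource names here are only illustrative — the bucket and the users boil down to something like this:

resource "aws_s3_bucket" "lfs" {
  bucket = var.s3_bucket_name

  # Tag the bucket with the environment and project plus any extra tags
  tags = merge(var.tags, {
    Environment = var.environment
    Project     = var.project_name
  })
}

resource "aws_iam_user" "lfs_users" {
  # One IAM user per entry in the new_users map
  for_each = var.new_users

  name = each.value.iam
  tags = var.tags
}

An IAM policy allowing those users to read and write objects in the bucket is also part of the picture; check the module source for the exact permissions it grants.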
Create the variables.tf file; we allow some configurability, but with sane defaults you can hit the ground running pretty fast (a sketch of the variables is shown below).
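Roughly, the variables consumed by the module call below look like this (the defaults shown are illustrative; the repo's variables.tf is the source of truth):

variable "s3_bucket_name" {
  description = "Name of the S3 bucket that will hold the LFS objects"
  type        = string
}

variable "environment" {
  description = "Environment tag, e.g. dev or prod"
  type        = string
  default     = "dev"
}

variable "project_name" {
  description = "Project tag applied to the created resources"
  type        = string
}

variable "new_users" {
  description = "Users to create: IAM username plus the Keybase handle used to encrypt their password"
  type = map(object({
    iam     = string
    keybase = string
  }))
  default = {}
}

variable "tags" {
  description = "Extra tags applied to all resources"
  type        = map(string)
  default     = {}
}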
We can use the previous files as a module to create the S3 bucket and the users:
module "s3_bucket" {
source = "PATH-TO-THE-PREVIUS-FILES-DIRECTORY"
s3_bucket_name = "my-s3-bucket"
environment = "dev"
project_name = "my-project"
new_users = {
"irodotos" : {
"iam" : "irodotos.gitlab"
"keybase" : "irodotos7"
},
"vasilis" : {
"iam" : "vasilis.gitlab"
"keybase" : "vasilis7"
}
}
tags = {
key1 = "value1",
key2 = "value2"
}
}
Important note
Before you run it, make sure that every new user has a Keybase account ( https://keybase.io/ ) and has uploaded their public key to Keybase so that it is publicly visible.
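The reason is that each user's console password is encrypted against their Keybase PGP key (you will decrypt it in a later step). Under the hood this most likely relies on Terraform's keybase: support in aws_iam_user_login_profile, roughly like this (illustrative sketch, not the repo's exact code):

resource "aws_iam_user_login_profile" "lfs_users" {
  for_each = var.new_users

  # Encrypt the generated console password with the user's public Keybase key
  user    = aws_iam_user.lfs_users[each.key].name
  pgp_key = "keybase:${each.value.keybase}"
}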
Run the Terraform plan to ensure the configuration is correct:
terraform plan
Apply the Terraform plan to create the resources:
terraform apply
The output of the Terraform apply is the encrypted password for each of the IAM users. Each user needs to take their encrypted password and decrypt it with their own private key from Keybase. After decrypting it, they can use the password to log in to their new AWS IAM user.
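A minimal sketch of the decryption step, assuming the Keybase CLI is installed and logged in (the encrypted value is the one printed by terraform apply for your user):

# Paste the encrypted password printed by `terraform apply`, then decode and decrypt it
echo "PASTE-YOUR-ENCRYPTED-PASSWORD-HERE" | base64 --decode | keybase pgp decrypt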
Important Notes
Take a close look at the examples and test directories to understand how to configure the variables correctly. Remember that you can also use existing users if you don't want to create new ones, and you can use your own S3 bucket if you already have one.
Step 2: Every new user needs to log in with their new IAM user and create access keys ( we will need them for the next step; see the example below )
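A simple way to store those access keys locally, assuming the AWS CLI is installed (the profile name here is just an example), is:

# Creates/updates a named profile in ~/.aws/credentials and ~/.aws/config
aws configure --profile git-lfs

Whichever method you choose, make sure the credentials end up somewhere the LFS transfer agent can read them; the lfs-dal README describes how it expects credentials to be supplied.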
Step 3: Disable GitLab LFS and Configure lfs-dal
With the S3 bucket and IAM users set up, the final step is to disable GitLab LFS and configure Git to use S3 instead.
1. Disable GitLab LFS in the GitLab project settings to ensure no new files are uploaded to GitLab's LFS.
2. Configure lfs-dal by following the steps in the repo below (a rough sketch follows the link):
https://github.com/regen100/lfs-dal.git
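For orientation, the gist of that setup is to register lfs-dal as a Git LFS custom transfer agent and point it at your bucket and credentials. The registration part uses standard git-lfs configuration keys; the backend-specific settings (bucket, region, credentials) vary, so follow the lfs-dal README for those. A sketch, with the backend settings left as placeholders:

# Install lfs-dal (e.g. via cargo if it is published there, or build it from the repo)
cargo install lfs-dal

# Register lfs-dal as the standalone custom transfer agent for this repository
git config lfs.standalonetransferagent lfs-dal
git config lfs.customtransfer.lfs-dal.path lfs-dal

# The S3 bucket, region and credentials are then configured as described in the
# lfs-dal README; the exact option names are intentionally not shown here.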
By following these steps, we successfully migrated from GitLab LFS to S3, reducing our storage costs significantly. The process involved careful planning and execution, but the cost savings make it worthwhile. We've also reduced the management overhead, since we no longer need to monitor and purchase additional gigabytes of storage. The scalability of S3 ensures that our storage needs are met without constant manual intervention.
We hope this guide helps you with your migration. If you have any questions or run into issues, please leave a comment below!