I have been working on my side project LabHub for a while now and I want to share how it started and how it has evolved over time.

Before LabHub

During my early years of college I was taking Cisco Networking Academy courses and had been using Packet Tracer for a while. I wasn’t that interested in networking, but the courses were part of the curriculum; I was more interested in programming and web development.

One day, my instructor was giving a lecture and I noticed he was using software I had never seen before. It wasn’t Packet Tracer, and I was curious because it looked more modern and had more features. It turned out to be PNetLab, a network emulator forked from eve-ng. I immediately searched for it, found the official website, read the documentation, and spun up a Google Cloud VM to install it. I was excited to try it out, but frustrated because I couldn’t get it to work.

I was missing a key component: the images. I searched for them but couldn’t find them anywhere. I googled for hours and even tried to build the images myself, but I couldn’t do it. At the time I didn’t know much about QEMU or how to build the images (later I found out it was actually really easy), and even if I had managed to build some of them, it would’ve taken forever to upload them to my server on my 2 Mbps upload speed. I tried asking on the PNetLab support group but was told to use ishare. I had already tried ishare and it wasn’t working for me; it just kept showing the same error message, which I figured was caused by Google Drive quota limits. If I wanted to get the images through ishare, I had to wait until the next day for the quota to reset.
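
For anyone curious, here’s roughly how simple building an image turned out to be. This is a minimal sketch assuming a vendor-provided .vmdk disk and a standard eve-ng/PNetLab install; the folder and file names are illustrative:

    # Minimal sketch: turning a vendor disk into a PNetLab/eve-ng QEMU image.
    # The <type>-<version> folder name and source file are illustrative.
    mkdir -p /opt/unetlab/addons/qemu/asav-9.18.1

    # Convert the vendor disk to qcow2 with the filename the emulator expects.
    qemu-img convert -f vmdk -O qcow2 asav.vmdk \
      /opt/unetlab/addons/qemu/asav-9.18.1/virtioa.qcow2

    # Fix ownership and permissions so the web UI picks up the new image.
    /opt/unetlab/wrappers/unl_wrapper -a fixpermissions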

The next day I asked my instructor how he got the images, and he told me he used ishare, but that I would also have to wait for the quota to reset. After class I kept digging and found some videos in Spanish on the KaLiNet YouTube channel about how to set up PNetLab, and I learned there was a Telegram group where people shared the images. I joined the group and got the images from a Google Drive link shared there. Digging deeper on the Internet, I also found a Google Shared Drive with a lot of images. I was very happy because I could finally get the images and use PNetLab.

UNetLab Cloud

After finally getting the images I started using PNetLab and was very happy with it. I could do labs that weren’t possible in Packet Tracer, and more complex ones too, using full operating system images instead of just a simulation of them. I then started thinking about how to make sure the images were always available to me, and about creating a website where I could upload them and share them with others. I remembered seeing a member of the Android community do something similar with Android ROMs: he used a Google Shared Drive and a Google Drive indexer that uses the Google Drive API to list the files in the drive. I figured I could do the same with the images: a repository of QEMU, IOL and Dynamips images for testing and simulation on PNetLab, eve-ng or any other QEMU-based emulator.

I wasn’t very knowledgeable about the limitations of the Google Drive API, but I set that up anyway. I created a Google Shared Drive and started copying the images to it one by one. It took a long time because I didn’t know how to automate copying folders from one Shared Drive to another, but it worked. Here’s what I used to make the index:

  • Free Google Shared Drive from MSGSuite: at that time they were giving away free Shared Drives with unlimited storage, created under Google Workspace for Education accounts they managed. Users could request a Shared Drive, it was created automatically through the Google API, and your email was added as an owner. This is no longer possible because a while ago Google limited education accounts to 100 TB of pooled storage for all users in the domain.
  • Cloudflare Workers & Google Drive Indexer: I looked for a Google Drive indexer on GitHub and found a good-looking fork of the goindex project with a nice UI and some extra features, goindex-extended. These indexers run on Cloudflare Workers and use the Google Drive API in the background to list the files in the Shared Drive and serve them to the user (see the sketch after this list). I followed the instructions in the README and got it working, then pointed a subdomain of mine at the Cloudflare Worker URL and had a working index of the images in the Shared Drive.
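
To give an idea of what happens behind the scenes, this is roughly the kind of Google Drive API v3 request such an indexer issues to list one folder inside a Shared Drive. The folder ID, drive ID and access token are placeholders:

    # Rough sketch of the files.list call an indexer makes for one folder.
    # FOLDER_ID, DRIVE_ID and ACCESS_TOKEN are placeholders.
    curl -s -G "https://www.googleapis.com/drive/v3/files" \
      -H "Authorization: Bearer $ACCESS_TOKEN" \
      --data-urlencode "q='$FOLDER_ID' in parents and trashed = false" \
      --data-urlencode "driveId=$DRIVE_ID" \
      --data-urlencode "corpora=drive" \
      --data-urlencode "includeItemsFromAllDrives=true" \
      --data-urlencode "supportsAllDrives=true" \
      --data-urlencode "fields=files(id,name,mimeType,size)"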

After I had the index working I shared the link in the Telegram group and people started using it. I was very happy to be able to help people in the same situation as me, and the best part was that the quota limits were no longer a problem (at least for a while): everyone could get the images without waiting for the Google Drive quota to reset on ishare.

At first I published the index on a subdomain of my personal website, but as the userbase kept growing I wanted a dedicated domain for it, so it was time to think of a name. PNetLab is based on eve-ng, eve-ng is based on UNetLab, and the path where the images live in a PNetLab/eve-ng installation is /opt/unetlab/addons/, so I thought UNetLab Cloud would be a good name for the project. I registered the domain unetlab.cloud and moved the index there.

Scaling UNetLab Cloud

Since I started sharing the repository in the PNetLab group, the userbase kept growing and the Cloudflare Worker was receiving a lot of requests. I was on the free tier of Cloudflare Workers and hit its limit of 100,000 requests per day in only a couple of days, so I had to set up alternate domains and workers to keep the service running whenever a worker got rate limited.

I also learned how to automate copying the images from one Shared Drive to another as a backup, since the Shared Drives I had initially set up could be taken down at any time by the Workspace owners. I always kept clones of the original Shared Drive and synced the others against it. rclone quickly became my go-to tool for this because it can mount multiple Shared Drives and run many operations easily. I even set up a Telegram bot to manage the images and the Shared Drives: if someone shared a new image through Google Drive, I could add it to the index with a simple bot command.
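
As an illustration, keeping the backups in sync boiled down to something like this. The remote names are made up; each one would be configured with rclone config and pointed at its Shared Drive:

    # Minimal sketch of syncing a backup Shared Drive against the original.
    # "original" and "backup1" are illustrative rclone remote names.
    # Server-side copies between Drive remotes avoid downloading anything
    # locally, and --fast-list cuts down API calls on large directory trees.
    rclone sync original: backup1: \
      --drive-server-side-across-configs \
      --fast-list --progress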

Another issue that came up later is that the Google Drive API only allows up to 750 GB of transfers per account per day, so once the users started exceeding that, I needed something similar to what I did with rclone: Service Accounts, plus a Google Drive indexer that supported rotating them. I ended up switching to a new indexer, Google Drive Index by Hash Hackers.
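
The same Service Account trick works on the rclone side too. A rough sketch of rotating accounts when one hits the daily cap, assuming sa1.json, sa2.json, and so on are credentials for Service Accounts that all have access to the Shared Drive:

    # Rough sketch: rotate Service Accounts around the 750 GB/day cap.
    # The /opt/sa/*.json credential files are illustrative.
    for sa in /opt/sa/sa*.json; do
      # --drive-stop-on-upload-limit makes rclone exit when it hits the
      # quota error, so we can fall through to the next Service Account.
      rclone copy original: backup1: \
        --drive-service-account-file "$sa" \
        --drive-stop-on-upload-limit && break
      echo "Quota hit with $sa, rotating to the next account..."
    done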

ishare2

As unetlab.cloud kept growing, I started thinking about building an alternative to ishare: a service that was easy to use and didn’t have ishare’s limitations. But there was a big problem. I didn’t know how to build a JSON API on top of the Google Drive API that could index all the available images and feed that information to the new ishare client, telling it which images were available and where to download them. If I shared links to the images directly, users would hit the exact same quota limits as with ishare, so I needed to index the images from my Google Drive indexer and serve them to users through it.

I didn’t want to build the whole index manually and was still thinking about how to automate the new indexing process, when someone DM’d me on Telegram saying they were building a new ishare client and wanted me to try it out. I wondered how they were going to build an index out of my site, and I was surprised to see they had done it all manually 🤯 on a huge Google Spreadsheet used as a database. That dedicated man’s name is Mati. And that’s how ishare2 was born.

For a long time we kept updating the Google Spreadsheet manually with the new images and their download links, but eventually I realized we could automate the process and save a lot of time. So I started working on a new way of indexing the images and saving them to a JSON file that could be hosted on GitHub and served to the ishare2 client. I used rclone to mount the Shared Drives and wrote some Python scripts that listed the files in them and saved that information to a JSON file. I then hosted the JSON file on GitHub and made the ishare2 client read it and show the images to the users from there.
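
The actual scripts were Python, but the core idea fits in a couple of lines of shell. A rough equivalent using rclone’s lsjson output piped through jq (the remote name and file extensions are illustrative):

    # Rough sketch of building the JSON index: list every file on the remote,
    # keep only image files, and emit the fields the client needs.
    # "labhub" is an illustrative rclone remote name.
    rclone lsjson -R --files-only labhub: \
      | jq '[.[] | select(.Name | test("\\.(qcow2|bin|image)$"))
             | {path: .Path, name: .Name, size: .Size}]' > images.json

The resulting images.json can then be committed to a GitHub repo and fetched by the client over its raw URL.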

The whole ishare2 CLI is written in Bash, and I am developing different implementations of it. I might talk about that in a future post.

Migrating from Google Drive to OneDrive

Starting January 23, 2023, Google was going to limit storage for education accounts to 100 TB pooled across all users in the domain. Since I was using a Google Shared Drive from MSGSuite, I was going to lose the unlimited storage I had, and I needed a new place to store the images and serve them to users. I thought about a Google Workspace account but didn’t want to pay for it, so I started looking for alternatives and found that Microsoft OneDrive has a Developer Plan that gives you 5 TB of storage for free.

That seemed like a good deal, so I started migrating the images to OneDrive. I also had to find a new indexer that supported OneDrive, and I found onedrive-vercel-index, which, as the name suggests, is a OneDrive indexer that can be deployed to Vercel. I followed the documentation and got it working. On December 29, 2022, I created a beta subdomain for the new index and started testing it with users, then switched the main domain over on January 22, 2023, and everything went smoothly. Mati (the creator of ishare2) and I also started working on an ishare2 update to support the new OneDrive indexer, and we got it working that same day. I was very happy with the result and have been able to keep the service up and running to this day.

Transitioning to LabHub

About a year ago, I started thinking about renaming the project to something more generic and separating it from the UNetLab brand, because it was getting more attention on Google than UNetLab itself and I wanted to avoid confusion with the UNetLab project. So in February 2023 I ran a Telegram poll with different name options, the users voted for LabHub, and the project was renamed accordingly. I then registered the domain labhub.eu.org through NIC.EU.ORG, a free domain registration service, which meant I no longer had to pay for the domain and could spend that money on other things.

I started migrating the index to the new domain and added a redirect from the old domain to the new one. I also wanted to get a new logo for the project so I asked qirkl, a designer I met on Telegram, to design a logo for LabHub. I was very happy with the result and I started using the new logo on the website.

I updated the indexes to point to the new domain and everything went smoothly. unetlab.cloud is now appraised at $3,911.00 USD, and sometimes I regret dropping it, but I think it was the right decision.

The Future of LabHub & ishare2

I will keep working on this project for as long as I can in my free time. I have a lot of ideas for the future of LabHub and ishare2 and I want to keep improving them. But most importantly, I want to document the process and share it with others so they can take my ideas and maybe build something even better. I will keep writing blog posts about how I develop the project and I will keep sharing the code on GitHub. I hope that this project will help others in the future and that it will be a good learning experience for me.