In this guide, I’ll show you how to copy (upload) files to Google Drive using a simple PowerShell script. This is a great way to automate moving items to the cloud for backup and retention purposes.
On my home server, I have a storage array specifically for backups (R:\). I use Veeam to take an image backup of my personal computer, and I also store my Home Assistant backups, Plex metadata backups, Tautulli backups, and more there. Everything goes to this one drive letter.
Having all backups in one location is great, but it doesn’t quite meet the 3-2-1 backup rule: 3 copies of your data, on 2 different media, with 1 copy offsite.
This script lets me meet the “offsite” portion of that rule. If you’re looking for a simple way to copy data to Google Drive on a schedule with Windows 10, 11, Server 2019, or Server 2022, let’s get started!
How It Works
The premise of the setup is simple. We download the Google Drive for desktop application, which maps Google Drive as a drive letter in Windows.
Then, we create a PowerShell script that uses the
Copy-Item cmdlet with a source and a destination folder. If you want to exclude certain folders from being uploaded, I’ll include steps for that in the script as well.
Then, you can use Windows Task Scheduler to control how frequently the automatic backup to Google Drive runs.
Why use PowerShell and not just select Google Drive as the destination folder?
There are a number of reasons I like using PowerShell for this. You could just download Google Drive for Desktop and store your backups directly on it, but then you don’t technically have a local copy. If you ever need to restore, it’s going to be faster from a local drive than from the cloud.
In addition, you can control how frequently the backup runs using a scheduled task. If you use Google Drive directly as the destination, it syncs in real time: as soon as something lands in the destination folder, it immediately uploads to Google, which may not be ideal when a large file was just written there.
I also like that this method lets you exclude certain folders. As I mentioned, some folders on my backup array contain large files, and I don’t want those uploaded to Google Drive all the time. So I created one script to copy everything except the excluded folders weekly, and a second script to copy those large folders monthly.
Step 1: Download Google Drive for Desktop
Assuming you already have a Google account, download Google Drive for Desktop from here. Once it’s downloaded, it’ll prompt you to sign in.
After signing in, you’ll see a new mapped drive for Google Drive and it should also be running in your taskbar. Make note of the drive letter.
Then, open the mapped drive. Click into My Drive and create a new folder.
Step 2: Create a PowerShell Script to Copy Folders to Google Drive
Open PowerShell ISE.
Note: If your source is on the C:\ drive, you can use the regular folder path (C:\users\administrator\desktop\Backup). But if your source is a secondary drive, you’ll need to use the UNC path (\\server\r$\), where r$ is the source drive letter.
For a one-time file or folder copy, you can run the command below. It copies all files, folders, and subfolders.
Copy-Item "\\danny-server\r$\" -Destination "G:\My Drive\HomeBackup\" -Recurse
Copy Files to Google Drive Using PowerShell and Exclude Folders
Here is my source folder. In the script below, replace the folder names in the -Exclude list with the names of the folders you’d like to keep out of your Drive.
$path = "\\danny-server\r$\"
$destination = "G:\My Drive\HomeBackup\"
Copy-Item -Path (Get-Item -Path "$path\*" -Exclude ('Backup', 'BackupCopy', 'HABackup', 'PlexBackup', 'SmartHomePursuitsBackup', 'VeeamBackup')).FullName -Destination $destination -Recurse -Force
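Since the script will eventually run unattended on a schedule, it can help to keep a log of each run. Here’s a sketch of the same script wrapped in a transcript; the Start-Transcript wrapper, -Verbose switch, and log path are my additions, not part of the original script.

```powershell
# Sketch: the same copy as above, but logging each run to a file.
# Log path is an example; adjust the exclude list to match yours.
Start-Transcript -Path "C:\Scripts\GoogleDriveBackup.log" -Append

$path = "\\danny-server\r$\"
$destination = "G:\My Drive\HomeBackup\"
Copy-Item -Path (Get-Item -Path "$path\*" -Exclude ('Backup', 'BackupCopy', 'HABackup', 'PlexBackup', 'SmartHomePursuitsBackup', 'VeeamBackup')).FullName -Destination $destination -Recurse -Force -Verbose

Stop-Transcript
```

The -Verbose switch makes Copy-Item report each item it copies, so the transcript shows exactly what went to Google Drive on each run.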
That’s it! The last step is to save your PowerShell script to a directory (GoogleDriveBackup.ps1). I keep all of my scripts in a folder like C:\Scripts.
Then, search Windows for “Task Scheduler”. Create a basic task and choose a date and time for it to run under Triggers. Under Action, choose “Start a program”, set the program to powershell.exe, and put the path to your script in the arguments field: -ExecutionPolicy Bypass -File "C:\Scripts\GoogleDriveBackup.ps1". (Pointing Task Scheduler directly at the .ps1 file usually won’t run it; by default Windows opens .ps1 files in a text editor.)
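If you’d rather create the task from PowerShell than from the Task Scheduler GUI, a sketch like this should work from an elevated prompt. The task name and the weekly 3 AM schedule are just examples, not something the setup requires:

```powershell
# Run the backup script every Sunday at 3:00 AM (example schedule).
$action  = New-ScheduledTaskAction -Execute "powershell.exe" `
    -Argument '-ExecutionPolicy Bypass -File "C:\Scripts\GoogleDriveBackup.ps1"'
$trigger = New-ScheduledTaskTrigger -Weekly -DaysOfWeek Sunday -At 3am
Register-ScheduledTask -TaskName "GoogleDriveBackup" -Action $action -Trigger $trigger
```

Either way, the result is the same: a scheduled task that launches powershell.exe with your script as the argument.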
You now have a way to automate copying files to Google Drive!
My Homelab Equipment
Here is some of the gear I use in my Homelab. I highly recommend each of them.
- Server 2019 w/ Hyper-V
- Case: Fractal Design Node 804
- Graphics Card: NVIDIA Quadro K600
- CPU: AMD Ryzen 7 2700
The full list of server components I use can be found on my Equipment List page.
Thx for article!
I tested code: Copy-Item -Path C:\testcopy\ -Destination C:\testcopy2\ -Recurse and it works fine.
But when i try to use it with G: :
Copy-Item -Path C:\testcopy\ -Destination G:\My Drive\test\ -Recurse
It gives me error:
Copy-Item : A positional parameter cannot be found that accepts argument ‘Drive\test\’.
At line:1 char:1
+ Copy-Item -Path C:\testcopy\ -Destination G:\My Drive\test\ -Recurse
+ CategoryInfo : InvalidArgument: (:) [Copy-Item], ParameterBindingException
+ FullyQualifiedErrorId : PositionalParameterNotFound,Microsoft.PowerShell.Commands.CopyItemCommand
Google Drive is properly mapped to G:
I think you’d just need to wrap the destination in quotes because there’s a space in “My Drive”, like this: -Destination "G:\My Drive\test\" -Recurse
Thx, Danny. I tried to change the code to upload data to mega.io for example, but i still get an error. could you help me? Thanks