Scott Hanselman

Upgrading the Storage Pool, Drives, and File System in a Synology to Btrfs

October 15, 2020 · Posted in Reviews | Tools

Making a 21TB Synology Storage Pool

I recently moved my home NAS over from a Synology DS1511 that I got in May of 2011 to a DS1520 that just came out.

I have blogged about the joy of having a home server over these last nearly 10 years in a number of posts.

That migration to the new Synology is complete, and I used the existing 2TB Seagate drives from before. These were Seagate 2TB Barracudas, which are quite affordable. They aren't NAS-rated though, and I'm starting to generate a LOT of video since working from home. I've also recently set up Synology Active Backup on the machines in the house, so everyone's system is imaged weekly, plus I've got our G-Suite accounts backed up locally.

REFERRAL LINKS: I use Amazon links in posts like this. When you use them, you're supporting this blog, my writing, and helping pay for hosting. Thanks!

I wanted to get reliable large drives that are also NAS-rated (vibration and duty cycle), and the sweet spot for LARGE drives right now is a 10TB Seagate IronWolf NAS drive. You can also get 4TB drives for under $100! I'm "running a business" here, so I'm going to deduct these drives and make the investment; I got 4 drives. I could have also gotten two 18TBs, or three 12TBs, to similar effect. These drives will be added to the pool and become roughly 21TB of RAID'ed storage.

My Synology was running the ext4 file system on Volume1, so the process to migrate to all new drives and an all new file system was very manual, but very possible:

  • Use a spare slot and add one drive.
    • I had a hot spare in my 5 drive NAS so I removed it to make a spare slot. At this point I have my 4x2TB and 1x10TB in slots.
  • Make a new Storage Pool on the one drive
  • Make a new Volume with the newer Btrfs file system to get snapshots, self-healing, and better mirroring.
  • Copy everything from Volume1 to Volume2.
    • I copied from my /volume1 to /volume2 (one way to script that copy is sketched below this list). I made all new shares named "Videos2" and "Software2" with the intention of renaming them to be the primaries later.
  • Remove Volume1 by removing a drive at a time until the Synology decides it's "failed" and can be totally forgotten.
    • As I removed each 2TB drive, I replaced it with a 10TB and expanded the new Storage Pool and Volume2. These expansions take time, as there's a complete consistency check.
    • Repeat this step for each drive.
  • You can either leave a single drive as Volume1 and keep your Synology applications on it, or you can move/reinstall them and remove Volume1 entirely.
  • Once I'd removed the final Storage Pool (as seen in the pic below) and my apps were either reinstalled on Volume2 or moved, I renamed all my shares from "Software2" etc. back to "Software," removing the appended "2."

The whole process took a few days, with checkpoints in between. Have a plan, go slow, and execute on that plan, checking in as the file system runs its consistency checks.
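One way to script the big copy step is rsync, run locally on the NAS over SSH. This is a minimal sketch, assuming you've enabled SSH on the Synology and using my example share names; File Station or cp work too:

# Sketch: copy each share from the old volume to the new one, on the NAS itself.
# -a preserves permissions and timestamps; --progress shows status as it runs.
rsync -a --progress /volume1/Videos/ /volume2/Videos2/
rsync -a --progress /volume1/Software/ /volume2/Software2/

The trailing slashes matter to rsync: they copy the contents of each folder rather than the folder itself.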

Removing drives

To be clear, another way would have been to copy EVERYTHING off to a single external drive, torch the whole Synology install, install the new drives, and copy back to the new install. There would have been a momentary risk there, with the single external holding everything. It's up to you, depending on your definitions of "easy" and "hassle." My way was somewhat tedious, but relatively risk free. Net net - it worked. Consider what works for you before you do anything drastic. Make a LOT OF BACKUPS. Practice the Backup Rule of Three.

Note you CAN remove all but one drive from a Synology, as the "OS" seems to be mirrored on each drive. However, your apps are almost always on /volume1/@apps.

Some Synology devices have 10Gbps connectors, but the one I have has 4x1Gbps ports. Next, I'll Link Aggregate those 4 ports and, with a 10Gbps desktop network card, be able to get 300-400MB/s disk access between my main desktop and the NAS. (Each 1Gbps link tops out around 110-120MB/s in practice, so four aggregated links make 300-400MB/s plausible.)

The Seagate drives have worked great so far. My only criticism is that the drives are somewhat louder (clickier) than their Western Digital counterparts. This isn't a problem as the NAS is in a closet, but I suspect I'd notice the sound if I had 4 or 5 drives going full speed with the NAS sitting on my desk.

Here are my other Synology posts:

Hope this helps!


Sponsor: Have you tried developing in Rider yet? This fast and feature-rich cross-platform IDE improves your code for .NET, ASP.NET, .NET Core, Xamarin, and Unity applications on Windows, Mac, and Linux.


Classic Path.DirectorySeparatorChar gotchas when moving from .NET Core on Windows to Linux

October 13, 2020 · Posted in Azure | DotNetCore | Linux

It's a Unix System, I know this!

An important step in moving my blog to Azure was to consider getting this .NET app, now a .NET Core app, to run on Linux AND Windows. Being able to run on both would give me and others a wider choice of hosting, allow hosting in Linux containers, and save me money, as Linux hosting tends to be cheaper, even on Azure.

Getting something to compile on Linux is not the same as getting it to run, of course.

Additionally, something might run well in one context and not another. My partner Mark (poppastring) on this project has been running this code on .NET for a while, albeit on Windows, and he runs on IIS in /blog as a subapplication. I run on Linux on Azure, and while I'm also on /blog, my site is behind Azure Front Door as a reverse proxy, which handles domain/blog/path and forwards domain/path along to the app.

Long story short, it's worked on both his blog and mine, until I tried to post a new blog post.

I use Open Live Writer (the open-sourced version of Windows Live Writer) to make a MetaWebLog API call to my blog. There are multiple calls to upload the binaries (PNGs), and a path is returned. A newly uploaded binary might have a path like https://hanselman.com/blog/content/binary/something.png. The file on disk (from the server's perspective) might be d:\whatever\site\wwwroot\content\binary\something.png.

This is 15-year-old ASP.NET 1, so there's some idiomatic stuff going on here that isn't modern, plus the vars have been added for watch window debugging, but do you see the potential issue?

private string GetAbsoluteFileUri(string fullPath, out string relFileUri)
{
    // Strip the content root, then trim the leading separator - but only '\' is trimmed!
    var relPath = fullPath.Replace(contentLocation, "").TrimStart('\\');
    var relUri = new Uri(relPath, UriKind.Relative);
    relFileUri = relUri.ToString();
    return new Uri(binaryRoot, relPath).ToString();
}

That '\\' is making a big assumption. A reasonable one in 2003, but a big one today. It's trimming a backslash off the start of the passed-in string. Then the Uri constructor starts combining things, we're mixing and matching \ and /, and we end up with truncated URLs that don't resolve.

Assumptions about path separators are a top issue when moving .NET code to Linux or Mac, and they're often buried deep in utility methods like this. The fix is one line:

var relPath = fullPath.Replace(contentLocation, String.Empty).TrimStart(Path.DirectorySeparatorChar);

We can use the correct constant, Path.DirectorySeparatorChar, or the little-known AltDirectorySeparatorChar, as Windows supports both. That's why this code worked on Mark's Windows deployment but broke as soon as it ran on my Linux deployment.

DOCS: Note that Windows supports either the forward slash (which is returned by the AltDirectorySeparatorChar field) or the backslash (which is returned by the DirectorySeparatorChar field) as path separator characters, while Unix-based systems support only the forward slash.
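If you want to be extra defensive, you can trim both separator constants, since a path could plausibly arrive with either one. A minimal sketch, not necessarily the exact code we shipped:

// Trim whichever separator the platform (or the caller) used.
var relPath = fullPath
    .Replace(contentLocation, String.Empty)
    .TrimStart(Path.DirectorySeparatorChar, Path.AltDirectorySeparatorChar);

TrimStart takes a params array of chars, so this trims a leading / or \ no matter which OS produced the path.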

It's also worth noting that each OS has different invalid path chars. I have some 404'ed images because some of my files have leading spaces on Linux but underscores on Windows. More on that (and other obscure but fun bugs/behaviors) in future posts. Here's a quick program that dumps the separator chars and invalid path chars for the current OS:

using System;
using System.IO;

static void Main()
{
    Console.WriteLine($"Path.DirectorySeparatorChar: '{Path.DirectorySeparatorChar}'");
    Console.WriteLine($"Path.AltDirectorySeparatorChar: '{Path.AltDirectorySeparatorChar}'");
    Console.WriteLine($"Path.PathSeparator: '{Path.PathSeparator}'");
    Console.WriteLine($"Path.VolumeSeparatorChar: '{Path.VolumeSeparatorChar}'");

    var invalidChars = Path.GetInvalidPathChars();
    Console.WriteLine("Path.GetInvalidPathChars:");
    for (int ctr = 0; ctr < invalidChars.Length; ctr++)
    {
        Console.Write($" U+{Convert.ToUInt16(invalidChars[ctr]):X4} ");
        if ((ctr + 1) % 10 == 0) Console.WriteLine(); // wrap every 10 chars
    }
    Console.WriteLine();
}

Here are some articles I've already written on the subject of legacy migrations to the cloud.

If you find any issues with this blog, like:

  • Broken links and 404s where you wouldn't expect them
  • Broken images, zero byte images, giant images
  • General oddness

Please file them here https://github.com/shanselman/hanselman.com-bugs and let me know!

Oh, and please subscribe to my YouTube and tell your friends. It's lovely.


Sponsor: Have you tried developing in Rider yet? This fast and feature-rich cross-platform IDE improves your code for .NET, ASP.NET, .NET Core, Xamarin, and Unity applications on Windows, Mac, and Linux.


Migrating this blog to Azure. It's done. Now the work begins.

October 08, 2020 · Posted in ASP.NET | Azure

I have been running this blog at https://hanselman.com/blog for almost 20 years. Well, coming up on 19, I believe.

Recently it moved from being:

  • a 13(?) year old .NET Framework app called DasBlog running on ASP.NET and a Windows Server on real metal hardware

to

  • a .NET Core app running on Linux in an Azure App Service

Finally. This blog, the main site, and the podcast site are all running on Azure Web Apps, built in Azure DevOps, managed by Azure Front Door, and watched by Application Insights. Yes, I pay for it with cash; I have no unlimited free Azure credits other than my $100 MSDN account.

Mark and I have been pairing on this for months and having a wonderful time. In fact, it's been about a year since this started.

Moving this blog is a significant achievement for a number of reasons, IMHO.

  • If we did it right:
    • you didn't notice anything
    • The URLs look cooler.
    • We broke nothing in SEO.
    • Perf is better.
    • Before, I could deploy the site only a few times a year, and I was afraid of it. Yesterday I deployed 11 times.
  • It was .NET 1.1, then 2.0, then 3.5, then 4.0, then stuck for 8 years.
    • It ran on a real Windows Server 2008 machine (no VM) at Sherweb, who have been a great partner for years. Extremely reliable hosting!
    • Now it's on Azure under Linux
  • We upgraded the ASP.NET WebForms app to ASP.NET Core with Mark's genius idea of splitting the app's responsibilities so that the original DasBlog templating language could be converted to simple Razor pages, and we could use ASP.NET TagHelpers to replace WebForms controls (see the sketch after this list).
    • This allowed me to port my template over in a day with minimal changes.
    • Once it compiled under .NET Core, it was easy to move from Windows to Linux, testing in WSL first.
    • We then moved the other dependent projects to .NET Standard 2 and compiled the whole thing as a .NET Core 3.1 LTS (Long Term Support) app. In fact, scroll down to the VERY bottom of this page and you can see what version we're on.
  • I set up CI/CD for the main site hanselman.com, this blog, and hanselminutes.com.
    • There are 3 sites now, all behind a reverse proxy from Azure Front Door to handle SSL, Firewalls, and more.
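To give a flavor of the WebForms-to-TagHelpers idea, here's a hypothetical sketch (not DasBlog's actual code) of a made-up <blog-title> element standing in for what used to be a server-side control:

// Sketch: a hypothetical TagHelper replacing an old WebForms-style control.
// In a Razor template you'd write: <blog-title post-title="@Model.Title"></blog-title>
using Microsoft.AspNetCore.Razor.TagHelpers;

public class BlogTitleTagHelper : TagHelper
{
    // Bound from the post-title attribute by naming convention
    public string PostTitle { get; set; }

    public override void Process(TagHelperContext context, TagHelperOutput output)
    {
        output.TagName = "h1";                      // render as an <h1>
        output.Content.SetContent(PostTitle ?? ""); // content is HTML-encoded for you
    }
}

Because TagHelpers look like regular HTML elements, the old templates could keep their shape while the logic moved into small classes like this.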

Next steps? Keep it running, watch for errors (5xx and 4xx), and make small incremental changes. The pages are still heavy; while ASP.NET keeps server response time under 20ms, there's still 2 seconds of JavaScript and a bunch of old crap to clean up. I've also got two decades of links, so I'm fixing 404s as they're reported or show up in Application Insights. I made a Dashboard to watch for them.


I'm going to spend the next month or so blogging about the process and experience in as much detail as I can.

Here are some articles I've already written on the subject:

If you find any issues with this blog, like:

  • Broken links and 404s where you wouldn't expect them
  • Broken images, zero byte images, giant images
  • General oddness

Please file them here https://github.com/shanselman/hanselman.com-bugs and let me know!

Oh, and please subscribe to my YouTube and tell your friends. It's lovely.


Sponsor: Never miss a beat with Seq. Live application logs and health checks. Download the Windows installer or pull the Docker image now.


Keeping your WSL Linux instances up to date automatically within Windows 10

October 06, 2020 · Posted in Linux | Win10

Hayden Barnes from Canonical, the folks who work on Ubuntu (lovely blog, check it out), had a great tweet where he recommended using the Windows Task Scheduler (think of it as a graphical cron job manager) to keep your WSL Linux instances up to date.

There are a few things to unpack here to get into the details.

First, if you run wsl --list -v, you'll see all the WSL Linux instances on your machine.

> wsl --list -v
  NAME                   STATE           VERSION
* Ubuntu-18.04           Running         2
  kali-linux             Stopped         1
  Alpine                 Stopped         1
  Ubuntu-20.04           Stopped         2
  WLinux                 Running         2
  docker-desktop-data    Stopped         2
  docker-desktop         Stopped         2

You can see I have a few. I spend most of my time in the Ubuntu instances, but I also occasionally drop into the kali-linux and WLinux instances. If I'm using LTS (long term support) distros, then there's minimal risk (my opinion) in "apt update" and "apt upgrade"-ing them every week or so. I could even do it unattended.

I could set up Task Scheduler with an "on login" task or a weekly task that calls wsl.exe, passing in -d for the distro name, -u root to run as root, and -e for the command. For example:

wsl -d "Wlinux" -u root -e apt update
wsl -d "Wlinux" -u root -e apt upgrade -y

Since I have several WSL instances, I could also make an "updateall.cmd" or .bat or .ps1 script and run it occasionally to keep them all updated on my own. Just change the -d to the name of each distro (a PowerShell sketch follows). One could imagine a group policy as well for large enterprises to do the same thing for developers using a custom or managed WSL instance.
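Here's a minimal PowerShell sketch of that idea. The distro names are examples from my machine; swap in your own from wsl --list:

# updateall.ps1 - update each apt-based WSL distro in turn (sketch)
$distros = @('Ubuntu-18.04', 'Ubuntu-20.04', 'kali-linux', 'WLinux')
foreach ($distro in $distros) {
    Write-Host "Updating $distro..."
    wsl -d $distro -u root -e apt update
    wsl -d $distro -u root -e apt upgrade -y
}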

You would not want to update or mess with the docker-managed WSL instances above, as they exist only to run your Docker Desktop-managed containers. Leave those to Docker to manage.

It's a whole new world out there, and I'm loving how I can move easily between multiple Linuxes on Windows 10. Check out my YouTube on WSL2 and please subscribe over there.


Sponsor: Never miss a beat with Seq. Live application logs and health checks. Download the Windows installer or pull the Docker image now.


How to use autocomplete at the command line for dotnet, git, winget, and more!

October 02, 2020 · Posted in Open Source

Many years ago, .NET Core added command-line "tab" completion for the .NET Core CLI in PowerShell and bash, but few folks have taken the moment it takes to set it up.

I enjoy setting up and making my prompt/command line/shell/terminal experience as useful (and pretty) as possible. You have lots of command line shells to choose from on Windows!

Keep in mind these are SHELLs not terminals or alternative consoles. You can run all of these in the Windows Terminal.

  • Classic cmd.exe command prompt (fake DOS!) - Still useful and Clink can make it more bash-y without going full bash.
  • Yori from Malcolm Smith
  • Starship - more on this later, it's somewhat unique
  • Windows PowerShell - a classic because...
  • PowerShell 7 is out and runs literally anywhere, including ARM machines!

I tend to use PowerShell 7 (formerly PowerShell Core) as my main prompt because it's a cross-OS prompt. I can use the same prompt, same functions, same everything on Windows and Linux.

But it's command-line autocompletion that brings me the most joy!

  • git ch<TAB> -> git checkout st<TAB> -> git checkout staging
  • dotnet bu<TAB> -> dotnet build
  • dotnet --list-s<TAB> -> dotnet --list-sdks
  • winget in<TAB> -> winget install -> winget install WinDi<TAB> -> winget install WinDirStat

Once you have successfully tab'ed your way to glory, it's hard to stop. With PowerShell and its cousins, this is made possible with Register-ArgumentCompleter. Here's what it looks like for the dotnet CLI.

# PowerShell parameter completion shim for the dotnet CLI
Register-ArgumentCompleter -Native -CommandName dotnet -ScriptBlock {
    param($commandName, $wordToComplete, $cursorPosition)
    dotnet complete --position $cursorPosition "$wordToComplete" | ForEach-Object {
        [System.Management.Automation.CompletionResult]::new($_, $_, 'ParameterValue', $_)
    }
}

Looks like a lot, but the only part that matters is this: when it sees the command "dotnet" and some partial text and the user presses TAB, it will call "dotnet complete", passing in the cursorPosition and the wordToComplete.

NOTE: If you understand how this works, you can easily make your own Argument Completer for those utilities that you use all the time at work! You can make them for the folks at work who use your utilities!
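For instance, here's a sketch of a completer for a hypothetical in-house CLI called "mytool." The tool and its "complete" subcommand are made up, but the pattern is identical: have the tool print one candidate per line, then wrap each one in a CompletionResult.

# Sketch: completion for a hypothetical CLI named "mytool" that supports
# "mytool complete <partial>" and prints one candidate per line.
Register-ArgumentCompleter -Native -CommandName mytool -ScriptBlock {
    param($commandName, $wordToComplete, $cursorPosition)
    mytool complete "$wordToComplete" | ForEach-Object {
        [System.Management.Automation.CompletionResult]::new($_, $_, 'ParameterValue', $_)
    }
}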

You never actually see this call to "dotnet complete." You just see yourself typing dotnet bui<TAB> and getting a series of choices to tab through!

Here's what happens behind the scenes:

>dotnet complete --position 3 bui
build
build-server
msbuild

You can add these to your $profile. Usually I run "notepad $profile" at the command line and it will auto-create the correct file in the correct location.

This is a super powerful pattern! You can get autocomplete for Git in PowerShell with posh-git, as well as for WinGet!
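For example, if I'm remembering the winget docs correctly, its shim looks something like this. Same shape as the dotnet one, except winget complete wants the whole command line, not just the current word:

Register-ArgumentCompleter -Native -CommandName winget -ScriptBlock {
    param($wordToComplete, $commandAst, $cursorPosition)
    # winget's output can include non-ASCII package names
    [Console]::InputEncoding = [Console]::OutputEncoding = $OutputEncoding = [System.Text.Utf8Encoding]::new()
    $word = $wordToComplete.Replace('"', '""')
    $ast = $commandAst.ToString().Replace('"', '""')
    winget complete --word="$word" --commandline "$ast" --position $cursorPosition | ForEach-Object {
        [System.Management.Automation.CompletionResult]::new($_, $_, 'ParameterValue', $_)
    }
}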

What are some more obscure autocompletes that you have added to your PowerShell profile?

ACTION: Finally, please take a moment and subscribe to my YouTube or head over to http://computerstufftheydidntteachyou.com and explore! I'd love to hit 100k subs over there. I heard they give snacks.


Sponsor: Suffering from a lack of clarity around software bugs? Give your customers the experience they deserve and expect with error monitoring from Raygun.com. Installs in minutes, try it today!

About Scott

Scott Hanselman is a former professor, former Chief Architect in finance, now speaker, consultant, father, diabetic, and Microsoft employee. He is a failed stand-up comic, a cornrower, and a book author.

facebook twitter subscribe
About   Newsletter
Hosting By
Hosted in an Azure App Service

Disclaimer: The opinions expressed herein are my own personal opinions and do not represent my employer's view in any way.