Scott Hanselman

Back to Basics: 32-bit and 64-bit confusion around x86 and x64 and the .NET Framework and CLR

February 11, 2009 | Posted in ASP.NET | Back to Basics | Learning .NET | Programming | Windows Client

I'm running 64-bit operating systems on every machine I have that is capable. I'm running Vista 64 on my quad-proc machine with 8 gigs of RAM, and Windows 7 Beta 64-bit on my laptop with 4 gigs.

Writing managed code is a pretty good way to not have to worry about any x86 (32-bit) vs. x64 (64-bit) details. Write managed code, compile, and it'll work everywhere.

The most important takeaway, from MSDN:

"If you have 100% type safe managed code then you really can just copy it to the 64-bit platform and run it successfully under the 64-bit CLR."

If you do that, you're golden. Moving right along.

WARNING: This is obscure and you probably don't care. But, that's this blog.

Back to 32-bit vs. 64-bit .NET Basics

I promote 64-bit a lot because I personally think my 64-bit machine is snappier and more stable. I also have buttloads of RAM which is nice.

The first question developers ask me when I'm pushing 64-bit is "Can I still run Visual Studio the same? Can I make apps that run everywhere?" Yes, totally. I run VS2008 all day on x64 machines writing ASP.NET apps, WPF apps, and Console apps and don't give it a thought. When I'm developing I usually don't sweat any of this. Visual Studio 2008 installs and runs just fine.

If you care about details, when you install .NET on a 64-bit machine the package is bigger because you're getting BOTH 32-bit and 64-bit versions of stuff. Some of the things that are 64-bit specific are:

  • Base class libraries (System.*)
  • Just-In-Time compiler
  • Debugging support
  • .NET Framework SDK

For example, I have a C:\Windows\Microsoft.NET\Framework and a C:\Windows\Microsoft.NET\Framework64 folder.
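
If you're curious which of the two your process actually loaded, here's a minimal sketch using RuntimeEnvironment.GetRuntimeDirectory(), which reports the directory of the CLR that's running you:

using System;
using System.Runtime.InteropServices;

public class RuntimeDirectoryDemo
{
    public static void Main()
    {
        // Prints ...\Microsoft.NET\Framework\... from a 32-bit process
        // and ...\Microsoft.NET\Framework64\... from a 64-bit process.
        Console.WriteLine(RuntimeEnvironment.GetRuntimeDirectory());
    }
}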

If I File|New Project and make a console app, and run it, this is what I'll see in the Task Manager:

[Screenshot: a 64-bit app running in Task Manager]

Notice that a bunch of processes have *32 by their names, including devenv.exe? Those are all 32-bit processes. However, my ConsoleApplication1.exe doesn't have that. It's a 64-bit process and it can access a ridiculous amount of memory (if you've got it...like 16TB, although I suspect the GC would be freaking out at that point.)

That 64-bit process is also having its code JIT compiled not to the x86 instruction set we're used to, but to the AMD64 instruction set. This is important to note: it doesn't matter whether you have an AMD or an Intel processor; if you're running 64-bit, you are using the AMD64 instruction set. The short story is - Intel lost. For us, it doesn't really matter.

Now, if I right click on the Properties dialog for this Project in Visual Studio I can select the Platform Target from the Build Tab:

[Screenshot: the Build tab showing the Platform Target dropdown]

By default, the Platform Target is "Any CPU." Remember that our C# or VB code compiles to IL, and that IL is basically processor-agnostic. It's the JIT that makes the decision at the last minute.

In my case, I am running 64-bit and it was set to Any CPU, so it was 64-bit at runtime. But, to repeat (I'll do it a few times, so forgive me), the most important takeaway, from MSDN:

"If you have 100% type safe managed code then you really can just copy it to the 64-bit platform and run it successfully under the 64-bit CLR."

Let me switch it to x86, and in Task Manager we see my app is now 32-bit.

[Screenshot: Task Manager showing ConsoleApplication1.exe running as a 32-bit (*32) process]

Cool, so...

32-bit vs. 64-bit - Why Should I Care?

Every once in a while you'll need to call from managed code into unmanaged code, and you'll need to give some thought to 64-bit vs. 32-bit. Unmanaged code cares DEEPLY about bitness; it exists in a processor-specific world.

Here's a great bit from an older MSDN article that explains part of it with emphasis mine:

In 64-bit Microsoft Windows, this assumption of parity in data type sizes is invalid. Making all data types 64 bits in length would waste space, because most applications do not need the increased size. However, applications do need pointers to 64-bit data, and they need the ability to have 64-bit data types in selected cases. These considerations led the Windows team to select an abstract data model called LLP64 (or P64). In the LLP64 data model, only pointers expand to 64 bits; all other basic data types (integer and long) remain 32 bits in length.

The .NET CLR for 64-bit platforms uses the same LLP64 abstract data model. In .NET there is an integral data type, not widely known, that is specifically designated to hold 'pointer' information: IntPtr, whose size is dependent on the platform (e.g., 32-bit or 64-bit) it is running on. Consider the following code snippet:

[C#]
public void SizeOfIntPtr()
{
    Console.WriteLine("SizeOf IntPtr is: {0}", IntPtr.Size);
}

When run on a 32-bit platform you will get the following output on the console:

SizeOf IntPtr is: 4

On a 64-bit platform you will get the following output on the console:

SizeOf IntPtr is: 8

Short version: If you're using IntPtrs, you care. If you're aware of what IntPtrs are, you likely know this already.

If you checked the property System.IntPtr.Size while running as x86, it'll be 4. It'll be 8 while under x64. To be clear, if you are running x86 even on an x64 machine, System.IntPtr.Size will be 4. This isn't a way to tell the bitness of your OS, just the bitness of your running CLR process.
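
To make that concrete, here's a tiny sketch contrasting C#'s fixed-size types with IntPtr. Note that C#'s int and long are always 4 and 8 bytes no matter the platform; only IntPtr tracks the native pointer width. (Later versions of the Framework added Environment.Is64BitProcess, but checking IntPtr.Size works everywhere.)

using System;

public class BitnessDemo
{
    public static void Main()
    {
        Console.WriteLine("sizeof(int):  {0}", sizeof(int));  // always 4
        Console.WriteLine("sizeof(long): {0}", sizeof(long)); // always 8
        Console.WriteLine("IntPtr.Size:  {0}", IntPtr.Size);  // 4 as x86, 8 as x64

        bool is64BitProcess = (IntPtr.Size == 8);
        Console.WriteLine("64-bit process? {0}", is64BitProcess);
    }
}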

Here's a concrete example. It's a pretty specific edge case, but a decent example. Here's an app that runs fine on x86.

using System;
using System.Runtime.InteropServices;

namespace TestGetSystemInfo
{
    public class WinApi
    {
        [DllImport("kernel32.dll")]
        public static extern void GetSystemInfo([MarshalAs(UnmanagedType.Struct)] ref SYSTEM_INFO lpSystemInfo);

        // Managed mirror of the Win32 SYSTEM_INFO structure.
        [StructLayout(LayoutKind.Sequential)]
        public struct SYSTEM_INFO
        {
            internal _PROCESSOR_INFO_UNION uProcessorInfo;
            public uint dwPageSize;
            public IntPtr lpMinimumApplicationAddress;
            public int lpMaximumApplicationAddress;
            public IntPtr dwActiveProcessorMask;
            public uint dwNumberOfProcessors;
            public uint dwProcessorType;
            public uint dwAllocationGranularity;
            public ushort dwProcessorLevel;
            public ushort dwProcessorRevision;
        }

        [StructLayout(LayoutKind.Explicit)]
        public struct _PROCESSOR_INFO_UNION
        {
            [FieldOffset(0)]
            internal uint dwOemId;
            [FieldOffset(0)]
            internal ushort wProcessorArchitecture;
            [FieldOffset(2)]
            internal ushort wReserved;
        }
    }

    public class Program
    {
        public static void Main(string[] args)
        {
            WinApi.SYSTEM_INFO sysinfo = new WinApi.SYSTEM_INFO();
            WinApi.GetSystemInfo(ref sysinfo);
            Console.WriteLine("dwProcessorType ={0}", sysinfo.dwProcessorType.ToString());
            Console.WriteLine("dwPageSize      ={0}", sysinfo.dwPageSize.ToString());
            Console.WriteLine("lpMaximumApplicationAddress ={0}", sysinfo.lpMaximumApplicationAddress.ToString());
        }
    }
}

See the mistake? It's not totally obvious. Here's what's printed under x86 (by changing the project properties):

dwProcessorType =586
dwPageSize      =4096
lpMaximumApplicationAddress =2147418111

The bug is subtle because it works under x86. As said in the P/Invoke Wiki:

The use of int will appear to be fine if you only run the code on a 32-bit machine, but will likely cause your application/component to crash as soon as it gets on a 64-bit machine.

In our case, it doesn't crash, but it sure is wrong. Here's what happens under x64 (or AnyCPU and running it on my x64 machine):

dwProcessorType =8664
dwPageSize      =4096
lpMaximumApplicationAddress =-65537

See that last number? Total crap. I've used an int for the pointer lpMaximumApplicationAddress rather than an IntPtr. Remember that IntPtr is smart enough to get bigger and point to the right stuff. When I change it from int to IntPtr:

dwProcessorType =8664
dwPageSize      =4096
lpMaximumApplicationAddress =8796092956671

That's better. Remember that IntPtr is platform/processor specific.
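
For reference, the entire fix is this one field declaration in SYSTEM_INFO:

public IntPtr lpMaximumApplicationAddress; // pointer-sized: 4 bytes as x86, 8 as x64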

You can still use P/Invokes like this and call into unmanaged code when you're on 64-bit, or when you're targeting both platforms; you just need to be thoughtful about it.

Gotchas: Other Assemblies

You'll also want to think about the assemblies that your application loads. It's possible that when you purchase a vendor's product in binary form, or bring an Open Source project into your project in binary form, you might consume an x86-compiled .NET assembly. That'll work fine on x86, but break on x64 when your AnyCPU-compiled EXE runs as x64 and tries to load an x86 assembly. You'll get a BadImageFormatException, as I did in this post.
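
If you suspect a binary dependency, you can check how it was compiled without loading it into your process. Here's a sketch using AssemblyName.GetAssemblyName, which reads only the metadata, so it works even when the architectures don't match (the file name below is just a placeholder):

using System;
using System.Reflection;

public class BitnessProbe
{
    public static void Main(string[] args)
    {
        // Placeholder path - point this at the vendor's assembly.
        string path = args.Length > 0 ? args[0] : "VendorComponent.dll";

        AssemblyName name = AssemblyName.GetAssemblyName(path);

        // MSIL = AnyCPU, X86 = 32-bit only (BadImageFormatException when
        // loaded into a 64-bit process), Amd64 = 64-bit only.
        Console.WriteLine("{0}: {1}", name.Name, name.ProcessorArchitecture);
    }
}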

The MSDN article gets it right when it says:

...it is unrealistic to assume that one can just run 32-bit code in a 64-bit environment and have it run without looking at what you are migrating.

As mentioned earlier, if you have 100% type safe managed code then you really can just copy it to the 64-bit platform and run it successfully under the 64-bit CLR.

But more than likely the managed application will be involved with any or all of the following:

  • Invoking platform APIs via p/invoke
  • Invoking COM objects
  • Making use of unsafe code
  • Using marshaling as a mechanism for sharing information
  • Using serialization as a way of persisting state

Regardless of which of these things your application is doing it is going to be important to do your homework and investigate what your code is doing and what dependencies you have. Once you do this homework you will have to look at your choices to do any or all of the following:

  • Migrate the code with no changes.
  • Make changes to your code to handle 64-bit pointers correctly.
  • Work with other vendors, etc., to provide 64-bit versions of their products.
  • Make changes to your logic to handle marshaling and/or serialization.

There may be cases where you make the decision not to migrate the managed code to 64-bit, in which case you have the option to mark your assemblies so that the Windows loader can do the right thing at start up. Keep in mind that downstream dependencies have a direct impact on the overall application.

I have found that for smaller apps that don't need the benefits of x64 but do need to call into some unmanaged COM components, marking the assemblies as x86 is a reasonable mitigation strategy. The same rules apply whether you're making a WPF app, a Console app, or an ASP.NET app. Very likely you won't have to deal with this stuff if you're staying managed, but I wanted to put it out there in case you bump into it.
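
If you can't recompile the EXE - say, it came from a vendor - the CorFlags tool that ships with the .NET Framework SDK can mark an existing assembly as 32-bit-required. Here's a sketch of the usual invocation (the file name is a placeholder, and note that a strong-named assembly has to be re-signed after you modify it):

corflags MyApplication.exe /32BIT+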

Regardless, always test on x86 and x64.


About Scott

Scott Hanselman is a former professor, former Chief Architect in finance, now speaker, consultant, father, diabetic, and Microsoft employee. He is a failed stand-up comic, a cornrower, and a book author.

February 11, 2009 11:34
What about Itanium? That's 64-bit too... IntPtr will be 8 bytes just like on x64...

There has to be a better way than checking the size of IntPtr - new architectures are coming up all the time.
February 11, 2009 12:06
I disagree. Itanium is dead, and x86 will be dead in 10 years; right now there is only x64, no? What "new architectures" are coming up all the time? We have a while to worry before 128-bit.
February 11, 2009 12:13
I'm just saying that 4 bytes = x86 and 8 bytes = x64 won't always be correct. There are CLR ports/implementations for lots of platforms - ARM, PowerPC, etc. Checking IntPtr seems like measuring a side effect.

(I agree that Itanium is dead though, it was just a handy example because it shows up in the drop down box in that screenshot.)
February 11, 2009 12:31
> The first question developers ask me when I'm pushing 64-bit is "Can I still run Visual Studio the same?

The answer is yes and no. If you've ever struggled with sn.exe on a 64-bit box, you know what I mean:

https://connect.microsoft.com/VisualStudio/feedback/ViewFeedback.aspx?FeedbackID=341426

February 11, 2009 12:37
Fowl - Ok, I'll give you the ports thing, but again, if you write managed code only, there's no trouble. If you're doing CLR code on platforms as diverse as Intel and ARM, then you've got bigger issues if you're sharing unmanaged code also. ;)

RichB - Sure, but on a 64-bit machine there's two CLRs...sn for 32-bit only works for that CLR. But, TOTALLY agreed, it's confusing.
February 11, 2009 13:16
Another one that's got me in the past with interop, other than the size of IntPtr, is the byte alignment of structures, or packing. Packing essentially ensures that each member of a structure is aligned to a certain sized byte boundary. So if a structure is packed to 4 bytes then every member of the structure will start on a 4 byte boundary and be allocated some multiple of 4 bytes of space in the structure. It's fairly common in the Windows SDK to have a structure on 32-bit Windows that is packed differently on 64-bit Windows. The way to spot this is looking for #pragma pack in the Windows SDK header files. This often means you need to declare different .NET representations of the structure with different packing so that your architecture agnostic .NET code will work on both platforms. Check out StructLayoutAttribute.Pack and #pragma pack on MSDN.
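
For illustration, here's a minimal sketch of what Pack does to managed layout. The struct is hypothetical, not from the Windows SDK; the point is that a Pack value that's right for x86 may produce the wrong shape on x64:

using System;
using System.Runtime.InteropServices;

// Hypothetical struct: one byte followed by a pointer-sized field.
[StructLayout(LayoutKind.Sequential, Pack = 4)]
public struct PackedExample
{
    public byte Tag;
    public IntPtr Value;
}

public class PackDemo
{
    public static void Main()
    {
        // x86: Value aligns to 4, total size is 8.
        // x64: Pack = 4 caps IntPtr's natural 8-byte alignment, so the
        // total size is 12 - which won't match a native struct that was
        // compiled without #pragma pack(4).
        Console.WriteLine(Marshal.SizeOf(typeof(PackedExample)));
    }
}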
February 11, 2009 13:27
Hi.
I actually ran into this problem just earlier this week, and it took me a while to figure out. The problem originated in a third-party assembly that loaded fine into Studio and worked great in unit testing (since the tests run inside VS*32), but failed horribly when running the application. I've checked the third-party assembly with Reflector, and it doesn't have any calls to unmanaged code; beats me why they decided to target x86 on it.
I've got a problem in that my solution is large, consisting of some 16 projects that are fairly heavily intertwined. Is there any way to have a project that can call an x86-targeted dll while still being loadable by a 64-bit application, or do I need to convert my entire solution to x86?

The worst part of this is that it will be running on a 32-bit server; I'm just having problems debugging it since my development box is 64-bit.
February 11, 2009 13:42
"Can I still run Visual Studio the same?" - not if you want to use edit and continue.

Like you, most of our dev machines are 64-bit - but it feels like 64-bit compat is an afterthought, even for developer tools (I'd have thought devs at MS would be the first to switch over).
February 11, 2009 14:18
When you're using a managed C++ dll, similar issues exist. The Itanium case is even worse here though because you'll need to recompile the MC++ dll targeted at Itanium. There's a lot more going on than just IntPtr sizes.

For now I share the thought Rico Mariani has on whether Visual Studio should be 32- or 64-bit; there aren't that many applications that need that much memory space. It's better to reduce footprint than to require 64-bit.

Also, if you do need the extra memory space, there's no way the app can run (stably and with the same datasets) on 32-bit. In that case the only thing that makes sense is to target just 64-bit (which is what Microsoft did with Exchange 2007).
February 11, 2009 14:29
There is one gotcha when debugging managed ASP.NET applications on 64-bit. If you attempt to attach to the web server process to debug your managed code, Visual Studio may assume you want to debug the native 64-bit code first and throw a weird error at you. You have to specify that you want to debug the managed code running in the process. Also, edit-and-continue is not available and you can't debug in mixed mode.

http://www.lazycoder.com/weblog/2008/09/12/debugging-managed-web-applications-on-x64/
http://msdn.microsoft.com/en-us/vstudio/aa718629.aspx
February 11, 2009 14:34
stusmith - If Edit and Continue is your thing, you can enable it by setting the Platform Target to x86 during development -- just remember to set it back to whatever it really should be before you ship and test :)

Kirk
February 11, 2009 14:44
I was all good until I saw this one:

"...Using serialization as a way of persisting state.."


Would you maybe know why this could be a problem?


Thanks,
Vladan
February 11, 2009 14:54
@Fowl - The purpose of IntPtr is to be exactly large enough to represent a pointer. "How big does a pointer need to be in this process?" and "How many bits is this process?" are just two ways of asking exactly the same question.
February 11, 2009 15:09
Something else to consider when running Studio on 64-bit machines is third-party libraries. It's not 64-bit per se; it's the newer OS that's needed. I just fought through getting an older version of Infragistics working on a Server 2008 system I set up. There were strange hoops I needed to jump through to get it loaded. This is all a requirement of the project I'm working on where I need some older stuff. If I was able to move to the latest and greatest Infragistics I would have been fine (or at least their docs say so).

But keep this in mind when thinking about "making the jump", you might have to play around to find just the right secret handshake to get it all working. There must be some configurations that just won't work but so far I've been quite successful.

As an aside, running Windows 2008 server as a workstation OS is working out great so far.

John

February 11, 2009 16:46
One other thing worth mentioning is that there are multiple GACs on x64. This was a gotcha for me when I had a third-party binary that required me to compile as x86; I didn't realize that there was a separate GAC for x86 and x64.
February 11, 2009 17:19
I'd like to point out that XNA apps have to be 32 bit.
February 11, 2009 17:40
@Vladan Strigo
Migration considerations

So after all that, what are the considerations with respect to serialization?

* IntPtr is either 4 or 8 bytes in length depending on the platform. If you serialize the information then you are writing platform-specific data to the output. This means that you can and will experience problems if you attempt to share this information.
February 11, 2009 19:08
The most important takeaway, from MSDN:
"If you have 100% type safe managed code then you really can just copy it to the 64-bit platform and run it successfully under the 64-bit CLR."
If you do that, you're golden. Moving right along.

If only it were that simple. It all pre-supposes that any .NET assembly added to the mix is also 32/64-bit transparent. I'm currently dealing with this issue with IBM's Informix drivers for .NET. The 64-bit version doesn't quite register itself properly, so you end up having to link directly to it, thus causing issues for a 32-bit-only user.
February 11, 2009 19:42
Excellent article; I needed this summary a few years back, and all that was available was detailed documentation.
February 11, 2009 19:45
Just to add yet another thing to the "visual studio works the same under x64 _EXCEPT_...." list we're making here: Code Coverage doesn't work under x64. According to this blog post http://blogs.msdn.com/hilliao/archive/2008/08/19/code-coverage-using-visual-studio-2008-s-code-coverage-tools.aspx the "Visual Studio team decided to proactively deny instrumenting 64-bit assemblies for code coverage in Visual Studio 2008" - nice work lads, way to help the world move forward.
February 11, 2009 20:06
Good post Scott. I did want to point out one thing though with some terminology:


If you checked the property System.IntPtr.Size while running as x86, it'll be 4. It'll be 8 while under x64. To be clear, if you are running x86 even on an x64 machine, System.IntPtr.Size will be 4. This isn't a way to tell the bitness of your OS, just the bitness of your running CLR process.


I found the term "bitness" to be a little confusing. I think maybe the terms "bit length" or "bit size" are more descriptive of what's causing the compatibility errors in the cases that you mentioned. Otherwise, good stuff as usual.
February 11, 2009 20:28
Vladan - Only binary serialization and only if you're being tricky with packing and structs.

Terry - Point taken.

Joseph Cooney - Agreed, that's intensely lame. I don't use the built in code coverage, but I'll ask the team if that's fixed going forward.

KirkJ - Exactly what I do if I need E&C. I'll ask if it's turned on in VS10.

Rupurt - Correct. I didn't get into that for fear I'd lose people, but marshaling structs IS a problem.
February 11, 2009 23:23
"Itanium is dead" Do you mean .NET (including ASP .NET) is dead on Itanium, and desktop applications are dead on Itanium? I completely agree there. As replacement that as a replacement for PA-RISC on HP-UX and for MIPS on HP NonStop it seems to be living pretty well. Oracle still seems to be supporting Itanium well, and even SQL Server doesn't seem to have abandoned it.

http://www.microsoft.com/servers/64bit/itanium/overview.mspx

Who knows, that stuff could go to x64 eventually.
February 11, 2009 23:35
I ran into a problem using x64 when communicating with the JET driver in ODBC to load data from an Excel spreadsheet into a DataSet. I had to tell the compiler to target x86 to make it work...
February 12, 2009 2:47
I was just about to say what Mike R said. You may be indirectly using 32-bit DLLs if you're using ODBC or OLEDB. In my case, Firebird and FoxPro connectivity doesn't work under x64.
February 12, 2009 2:48
Oh, and I almost forgot....Why the heck do the Team Foundation client dlls need to be x86 only? They wrap a bunch of webservice calls to TFS (and do a nice job of it) but (although I haven't decompiled them to check) I doubt they need to do anything unmanaged. Just because Visual Studio is stuck in the x86 tar-pit doesn't mean everything else in the tool-chain should be. I need to see if this has been fixed in dev10 too.
February 12, 2009 3:08
This article saved me a ton of x86 to x64 conversion headaches: the method of passing p/Invoke parameters changed with the x64 CLR. This hit us badly because we were in the habit of using ref array[0] to get a pointer to an array, which simply doesn't work anymore. You have to set proper properties on parameters.
February 12, 2009 3:13
Hi Scott,
How about x64 on Windows Azure? Could I still upload apps built on a 32-bit OS and IDE without
encountering problems, using their tools for VS2008?

February 12, 2009 8:35
There are a lot of hazards to walk through when doing P/Invokes. Some Win32 functions are implemented as #defines in x86, but are DLL exports in x64.

For example: GetWindowLongPtrW in user32.dll. It doesn't actually exist in x86. Instead, winuser.h #define's it straight over to GetWindowLongW.

(I'm actually debugging this right now in an alpha build of Paint.NET v3.5 :) )
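
The usual workaround - a sketch of the common pattern, not Paint.NET's actual code - is to declare both exports and branch on pointer size:

using System;
using System.Runtime.InteropServices;

public static class NativeMethods
{
    // 32-bit Windows has no GetWindowLongPtrW export; winuser.h #defines
    // it to GetWindowLongW, so managed code has to do the same fallback.
    [DllImport("user32.dll", EntryPoint = "GetWindowLongW")]
    private static extern int GetWindowLong32(IntPtr hWnd, int nIndex);

    [DllImport("user32.dll", EntryPoint = "GetWindowLongPtrW")]
    private static extern IntPtr GetWindowLongPtr64(IntPtr hWnd, int nIndex);

    public static IntPtr GetWindowLongPtr(IntPtr hWnd, int nIndex)
    {
        return (IntPtr.Size == 8)
            ? GetWindowLongPtr64(hWnd, nIndex)
            : new IntPtr(GetWindowLong32(hWnd, nIndex));
    }
}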
February 12, 2009 23:30
Re: Itanium. Well, it's definitely dead for desktop/client apps.*

Paint.NET v2.6 actually shipped with IA64 support, about 3 years ago. I had written a little utility that would ping my webstats and produce a barchart showing the distribution of usage with respect to OS, based on which update manifests had been touched (they're just text files, and the file names are decorated w/ locale, OS version, and x86/x64/ia64). The most prevalent platform was, of course, Windows XP x86. That update manifest had several hundred thousand downloads over the course of a month or so.

The IA64 update manifest had been downloaded precisely one time over the entire previous year. I dropped support for it in the very next release.

It was kind of fun though -- I had remote desktop access to an Itanium at one point. Paint.NET worked fine on it, there just wasn't any point.

* IMO.
February 15, 2009 8:06
Hi Scott, a bit of a shot, but does this ("If you have 100% type safe managed code then you really can just copy it to the 64-bit platform and run it successfully under the 64-bit CLR.") mean that .NET purity is not a myth? You really can do a lot without P/Invoke, and then these issues don't come up. -- Jeff
February 18, 2009 0:41
Jeff - Well, in fact the P/Invoking is happening in the BCL then. It's just hidden away in the 32- or 64-bit version of the .NET Framework. Your code is 'pure,' but at SOME point, someone has to call LoadLibrary(), right? ;)
February 19, 2009 18:55
My application references Visual C++ redistributables like msvcr80.dll at runtime. Do I need to have two separate codebases for x86 and x64?
February 19, 2009 18:59
Also, is there any tool available (similar to FxCop for managed) to analyze the unmanaged code portability issues? I have thousands of lines written in C++ :(

Comments are closed.

Disclaimer: The opinions expressed herein are my own personal opinions and do not represent my employer's view in any way.